Dogs vs. Cats

This project is based on Kaggle's Dogs vs. Cats Redux: Kernels Edition competition. The training data can be downloaded here.

In [1]:
# import PIL
# from PIL import Image
import numpy as np
from tqdm import tqdm
from sklearn.model_selection import train_test_split
import pickle
import os
import math
import random
import zipfile 
import tensorflow as tf
import matplotlib.pyplot as plt

from keras.models import *
from keras.layers import *
from keras.applications import *
from keras.preprocessing.image import *
from helper import *
from keras import optimizers

BATCH_SIZE = 128
%matplotlib inline
/home/ubuntu/anaconda3/envs/tensorflow_p36/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  from ._conv import register_converters as _register_converters
Using TensorFlow backend.

Data Preprocessing

Kaggle provides the data as two archives, train and test. After unzipping train.zip the files look like:

  • train/cat.{index}.jpg
  • train/dog.{index}.jpg

There are 12,500 images each of cats and dogs (index 0 to 12499). The images vary in size, so they are first preprocessed to 200x200 pixels.
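As a sketch, fixed-size preprocessing of a single image could look like this with Pillow (in this notebook the actual resizing happens later via the target_size argument of the Keras data generators; resize_image below is only an illustrative helper, not part of helper.py):

```python
from PIL import Image


def resize_image(src_path, dst_path, size=(200, 200)):
    """Resize one image to a fixed size and save it.

    Illustrative only: the notebook's pipeline resizes on the fly
    through ImageDataGenerator's target_size instead.
    """
    img = Image.open(src_path)
    img = img.resize(size, Image.BILINEAR)
    img.save(dst_path)
```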

Unzip the train.zip and test.zip archives

In [3]:
if os.path.exists("data/train"):  
    print("Skipping unzip: folder 'data/train' already exists.")
else:  
#     os.makedirs("data/")  
    
    zip_file = zipfile.ZipFile("data/train.zip")  
    for names in tqdm(zip_file.namelist()):  
        zip_file.extract(names,"data/")  
    zip_file.close()  
    
    zip_file = zipfile.ZipFile("data/test.zip")  
    for names in tqdm(zip_file.namelist()):  
        zip_file.extract(names,"data/")  
    zip_file.close()  
    
    print("All Data unzipped")
100%|██████████| 25001/25001 [00:15<00:00, 1620.88it/s]
100%|██████████| 12501/12501 [00:08<00:00, 1464.34it/s]
All Data unzipped

Reorganize the training and test file structure

Keras' ImageDataGenerator requires a specific folder structure to infer class labels, so the folders are rearranged accordingly.
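flow_from_directory treats each subfolder as one class, assigning integer labels in alphabetical order. A simplified pure-Python sketch of that label inference (not the actual Keras implementation):

```python
import os


def infer_classes(directory):
    """Mimic how flow_from_directory derives class labels:
    each subfolder is one class, sorted alphabetically."""
    classes = sorted(
        d for d in os.listdir(directory)
        if os.path.isdir(os.path.join(directory, d))
    )
    # map class name -> integer label, e.g. {'cat': 0, 'dog': 1}
    return {name: idx for idx, name in enumerate(classes)}
```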

In [12]:
import shutil

# data/
#     train2/
#         dog/
#             dog.1.jpg
#             ...
#         cat/
#             cat.1.jpg
#             ...
#     test2/
#         test/
#             cat.1.jpg
#             dog.1.jpg
#             ...


if not os.path.exists("data/train2"):
    train_filenames = os.listdir('data/train')

    filter_train_cat = filter(lambda x:x[:3] == "cat", train_filenames)
    filter_train_dog = filter(lambda x:x[:3] == "dog", train_filenames)
#     shutil.rmtree("data/train2")
    os.mkdir("data/train2")
    os.mkdir("data/train2/cat")
    os.mkdir("data/train2/dog")


    ## Shortcuts created with os.symlink cannot be read locally on macOS, so the images are copied instead
    for filename in filter_train_cat:
        shutil.copy("data/train/{}".format(filename),"data/train2/cat/{}".format(filename))

    for filename in filter_train_dog:
        shutil.copy("data/train/{}".format(filename),"data/train2/dog/{}".format(filename))


if not os.path.exists("data/test2"):
    shutil.copytree("data/test","data/test2/test")

print("Done")
Done
In [2]:
# data/splitted
#     train/
#         dog/
#             dog.1.jpg
#             ...
#         cat/
#             cat.1.jpg
#             ...
#     valid/
#         dog/
#             dog.1.jpg
#             ...
#         cat/
#             cat.1.jpg
#             ...

if not os.path.exists("data/splitted/train"):
    all_filenames = os.listdir('data/train')

    filter_cat = filter(lambda x:x[:3] == "cat", all_filenames)
    filter_dog = filter(lambda x:x[:3] == "dog", all_filenames)
    
    all_cats = list(filter_cat)
    all_dogs = list(filter_dog)
    
    train_cat,valid_cat = train_test_split(all_cats,test_size=0.2,random_state=2018)
    train_dog,valid_dog = train_test_split(all_dogs,test_size=0.2,random_state=2018)
       
    os.makedirs("data/splitted/train/dog")
    os.makedirs("data/splitted/train/cat")
    os.makedirs("data/splitted/valid/dog")
    os.makedirs("data/splitted/valid/cat")
    
    for filename in train_cat:
        shutil.copy("data/train/{}".format(filename),"data/splitted/train/cat/{}".format(filename))
    
    for filename in valid_cat:
        shutil.copy("data/train/{}".format(filename),"data/splitted/valid/cat/{}".format(filename))

    for filename in train_dog:
        shutil.copy("data/train/{}".format(filename),"data/splitted/train/dog/{}".format(filename))
    
    for filename in valid_dog:
        shutil.copy("data/train/{}".format(filename),"data/splitted/valid/dog/{}".format(filename))

print("Done")   
Done

Transfer Learning: Full Transfer

Take the "VGG16", "ResNet50", "InceptionV3", "Xception", and "InceptionResNetV2" models pretrained on ImageNet (without their output layers) and add a simple GlobalAveragePooling plus Dropout head as the output layer. The transferred model serves as a feature extractor, while the custom head does the classification.
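The GlobalAveragePooling step collapses each convolutional feature map to its spatial mean, turning e.g. a (7, 7, 2048) ResNet50 output into a 2048-d vector. In NumPy terms:

```python
import numpy as np


def global_average_pool(feature_maps):
    """(batch, height, width, channels) -> (batch, channels).

    This is what Keras' GlobalAveragePooling2D computes: the mean
    of each feature map over its spatial dimensions.
    """
    return feature_maps.mean(axis=(1, 2))
```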

Exporting the feature vectors

First, use the transferred models to extract image features and save those features to disk. This speeds up subsequent classifier training, since the features need not be re-extracted every time a classifier is trained.

The feature files are saved as feature_extract/original_%modelname%.h5
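The exact HDF5 schema written by extract_feature_by_pretrainedModel (a helper.py function not shown here) is an assumption; a minimal save/load round trip along those lines could look like:

```python
import h5py


def save_features(path, train_features, train_labels, test_features):
    """Persist extracted features to an HDF5 file.

    Dataset names are assumptions; the real helper.py may differ.
    """
    with h5py.File(path, "w") as f:
        f.create_dataset("train", data=train_features)
        f.create_dataset("label", data=train_labels)
        f.create_dataset("test", data=test_features)


def load_features(path):
    """Read the three arrays back into memory."""
    with h5py.File(path, "r") as f:
        return f["train"][:], f["label"][:], f["test"][:]
```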

In [4]:
make_dir_not_exist("feature_extract")
extract_feature_by_pretrainedModel(ResNet50, (224, 224),resnet50.preprocess_input)
extract_feature_by_pretrainedModel(InceptionV3, (299, 299), inception_v3.preprocess_input)
extract_feature_by_pretrainedModel(Xception, (299, 299), xception.preprocess_input)
extract_feature_by_pretrainedModel(InceptionResNetV2, (299, 299), inception_resnet_v2.preprocess_input)
extract_feature_by_pretrainedModel(VGG16,(224,224),vgg16.preprocess_input)

print("All Model Features Extracted")
========= Extract Featrues by InceptionV3 With GlobalAveragePool==========
Found 25000 images belonging to 2 classes.
Found 12500 images belonging to 1 classes.
Extracting features from train data by InceptionV3 With GlobalAveragePool ...
196/196 [==============================] - 322s 2s/step
Extracting features from test data by InceptionV3 With GlobalAveragePool ...
98/98 [==============================] - 160s 2s/step
saving features to feature_extract/original_InceptionV3.h5
Feature saved for InceptionV3 With GlobalAveragePool

========= Extract Featrues by Xception With GlobalAveragePool==========
Found 25000 images belonging to 2 classes.
Found 12500 images belonging to 1 classes.
Extracting features from train data by Xception With GlobalAveragePool ...
196/196 [==============================] - 514s 3s/step
Extracting features from test data by Xception With GlobalAveragePool ...
98/98 [==============================] - 257s 3s/step
saving features to feature_extract/original_Xception.h5
Feature saved for Xception With GlobalAveragePool

========= Extract Featrues by InceptionResNetV2 With GlobalAveragePool==========
Found 25000 images belonging to 2 classes.
Found 12500 images belonging to 1 classes.
Extracting features from train data by InceptionResNetV2 With GlobalAveragePool ...
196/196 [==============================] - 603s 3s/step
Extracting features from test data by InceptionResNetV2 With GlobalAveragePool ...
98/98 [==============================] - 300s 3s/step
saving features to feature_extract/original_InceptionResNetV2.h5
Feature saved for InceptionResNetV2 With GlobalAveragePool

========= Extract Featrues by VGG16 With GlobalAveragePool==========
Found 25000 images belonging to 2 classes.
Found 12500 images belonging to 1 classes.
Extracting features from train data by VGG16 With GlobalAveragePool ...
196/196 [==============================] - 266s 1s/step
Extracting features from test data by VGG16 With GlobalAveragePool ...
98/98 [==============================] - 132s 1s/step
saving features to feature_extract/original_VGG16.h5
Feature saved for VGG16 With GlobalAveragePool

All Model Features Extracted

Training the Model: Full Transfer

Train a classifier on the feature vectors extracted by each of "VGG16", "ResNet50", "InceptionV3", "Xception", and "InceptionResNetV2", logging accuracy and loss with TensorBoard.

The TensorBoard log files are saved in:

  • logs/original/%modelname% - Adadelta optimizer
  • logs/original/%modelname%_sgd_lrxx_decayxx_momxx - SGD optimizer

The model files are saved as model/original_%modelname%_output
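The SGD runs below use Nesterov momentum: each update evaluates the gradient at a look-ahead point and folds it into a velocity term. A NumPy sketch of the update rule (illustrative; Keras' exact formulation differs slightly):

```python
def nesterov_sgd_step(w, grad_fn, velocity, lr=0.001, momentum=0.9):
    """One Nesterov-momentum update.

    Evaluate the gradient at the look-ahead point w + momentum*velocity,
    blend it into the velocity, then apply the velocity to the weights.
    """
    lookahead = w + momentum * velocity
    g = grad_fn(lookahead)
    velocity = momentum * velocity - lr * g
    return w + velocity, velocity
```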

In [15]:
models = {}

learning_rate = 0.001
decay = 0
momentum=0.9
optimizer = optimizers.SGD(lr=learning_rate, decay=decay, momentum=momentum,nesterov=True)

# optimizer = "adadelta"

for model_name in ["ResNet50"]:
    tensorboard_directory = "logs/original/%s_sgd_lr%s_decay%s_mom%s"%(model_name,learning_rate,decay,momentum)
    del_file_if_exist(tensorboard_directory)
    callbacks=[TensorBoard(tensorboard_directory,batch_size=BATCH_SIZE)]
    
    # i.e. models["VGG"]=(model,history)
    models[model_name] = fit_model_original_ouput(model_name,
                               epochs=40,
                               callbacks=callbacks,
                               auto_save=True,
                               optimizer=optimizer)
load features from 「original_ResNet50.h5」
model created
Train on 20000 samples, validate on 5000 samples
Epoch 1/40
20000/20000 [==============================] - 1s 38us/step - loss: 0.1486 - acc: 0.9410 - val_loss: 0.0553 - val_acc: 0.9808
Epoch 2/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0673 - acc: 0.9773 - val_loss: 0.0453 - val_acc: 0.9844
Epoch 3/40
20000/20000 [==============================] - 1s 30us/step - loss: 0.0574 - acc: 0.9792 - val_loss: 0.0418 - val_acc: 0.9852
Epoch 4/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0530 - acc: 0.9806 - val_loss: 0.0397 - val_acc: 0.9860
Epoch 5/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0482 - acc: 0.9824 - val_loss: 0.0384 - val_acc: 0.9864
Epoch 6/40
20000/20000 [==============================] - 1s 30us/step - loss: 0.0458 - acc: 0.9829 - val_loss: 0.0375 - val_acc: 0.9866
Epoch 7/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0432 - acc: 0.9840 - val_loss: 0.0366 - val_acc: 0.9868
Epoch 8/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0445 - acc: 0.9848 - val_loss: 0.0355 - val_acc: 0.9870
Epoch 9/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0436 - acc: 0.9844 - val_loss: 0.0363 - val_acc: 0.9866
Epoch 10/40
20000/20000 [==============================] - 1s 30us/step - loss: 0.0443 - acc: 0.9833 - val_loss: 0.0350 - val_acc: 0.9870
Epoch 11/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0410 - acc: 0.9856 - val_loss: 0.0347 - val_acc: 0.9872
Epoch 12/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0406 - acc: 0.9854 - val_loss: 0.0349 - val_acc: 0.9868
Epoch 13/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0397 - acc: 0.9861 - val_loss: 0.0340 - val_acc: 0.9882
Epoch 14/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0407 - acc: 0.9852 - val_loss: 0.0341 - val_acc: 0.9878
Epoch 15/40
20000/20000 [==============================] - 1s 30us/step - loss: 0.0406 - acc: 0.9854 - val_loss: 0.0347 - val_acc: 0.9870
Epoch 16/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0380 - acc: 0.9859 - val_loss: 0.0345 - val_acc: 0.9876
Epoch 17/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0387 - acc: 0.9859 - val_loss: 0.0335 - val_acc: 0.9888
Epoch 18/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0383 - acc: 0.9870 - val_loss: 0.0342 - val_acc: 0.9876
Epoch 19/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0366 - acc: 0.9862 - val_loss: 0.0334 - val_acc: 0.9884
Epoch 20/40
20000/20000 [==============================] - 1s 30us/step - loss: 0.0373 - acc: 0.9856 - val_loss: 0.0333 - val_acc: 0.9888
Epoch 21/40
20000/20000 [==============================] - 1s 30us/step - loss: 0.0372 - acc: 0.9860 - val_loss: 0.0353 - val_acc: 0.9874
Epoch 22/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0363 - acc: 0.9869 - val_loss: 0.0333 - val_acc: 0.9876
Epoch 23/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0364 - acc: 0.9863 - val_loss: 0.0327 - val_acc: 0.9886
Epoch 24/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0361 - acc: 0.9867 - val_loss: 0.0337 - val_acc: 0.9882
Epoch 25/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0346 - acc: 0.9866 - val_loss: 0.0334 - val_acc: 0.9880
Epoch 26/40
20000/20000 [==============================] - 1s 30us/step - loss: 0.0343 - acc: 0.9877 - val_loss: 0.0325 - val_acc: 0.9882
Epoch 27/40
20000/20000 [==============================] - 1s 30us/step - loss: 0.0344 - acc: 0.9881 - val_loss: 0.0326 - val_acc: 0.9882
Epoch 28/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0349 - acc: 0.9877 - val_loss: 0.0326 - val_acc: 0.9884
Epoch 29/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0334 - acc: 0.9882 - val_loss: 0.0321 - val_acc: 0.9888
Epoch 30/40
20000/20000 [==============================] - 1s 30us/step - loss: 0.0338 - acc: 0.9873 - val_loss: 0.0329 - val_acc: 0.9882
Epoch 31/40
20000/20000 [==============================] - 1s 30us/step - loss: 0.0350 - acc: 0.9883 - val_loss: 0.0335 - val_acc: 0.9880
Epoch 32/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0333 - acc: 0.9882 - val_loss: 0.0332 - val_acc: 0.9882
Epoch 33/40
20000/20000 [==============================] - 1s 30us/step - loss: 0.0338 - acc: 0.9873 - val_loss: 0.0317 - val_acc: 0.9886
Epoch 34/40
20000/20000 [==============================] - 1s 30us/step - loss: 0.0323 - acc: 0.9889 - val_loss: 0.0321 - val_acc: 0.9884
Epoch 35/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0319 - acc: 0.9890 - val_loss: 0.0327 - val_acc: 0.9884
Epoch 36/40
20000/20000 [==============================] - 1s 30us/step - loss: 0.0334 - acc: 0.9879 - val_loss: 0.0320 - val_acc: 0.9884
Epoch 37/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0321 - acc: 0.9894 - val_loss: 0.0320 - val_acc: 0.9888
Epoch 38/40
20000/20000 [==============================] - 1s 30us/step - loss: 0.0329 - acc: 0.9882 - val_loss: 0.0330 - val_acc: 0.9878
Epoch 39/40
20000/20000 [==============================] - 1s 29us/step - loss: 0.0328 - acc: 0.9877 - val_loss: 0.0317 - val_acc: 0.9890
Epoch 40/40
20000/20000 [==============================] - 1s 30us/step - loss: 0.0331 - acc: 0.9879 - val_loss: 0.0322 - val_acc: 0.9882

Test the Model and Export a CSV

In [16]:
for model_name in ["ResNet50"]:
    model = models[model_name][0]
    X_test = load_test_data(model_name)
    test_to_csv(model,X_test,get_filename(model_name,"original","sgd_lr0.001_mom_0.9",ext="csv"))
load test data from 「original_ResNet50.h5」
12500/12500 [==============================] - 0s 38us/step
Found 12500 images belonging to 1 classes.
   id  label
0   1  0.995
1   2  0.995
2   3  0.995
3   4  0.995
4   5  0.005
5   6  0.005
6   7  0.005
7   8  0.005
8   9  0.005
9  10  0.005
CSV file saved to csv_output/original_ResNet50_sgd_lr0.001_mom_0.9.csv
/home/ubuntu/cats_dogs/helper.py:486: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
  df.set_value(index - 1, 'label', predict[i])
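The FutureWarning above comes from helper.py: DataFrame.set_value was deprecated (and later removed) in pandas, and the .at accessor is the drop-in replacement for scalar writes:

```python
import pandas as pd

df = pd.DataFrame({"id": [1, 2, 3], "label": [0.5, 0.5, 0.5]})

# deprecated: df.set_value(0, "label", 0.995)
df.at[0, "label"] = 0.995  # same scalar write, supported API
```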

Transfer Learning: Merging Transferred Models

Try using several pretrained models together as feature extractors, combining their output feature vectors as the classifier input, in order to raise classification accuracy and lower the loss.
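Merging simply concatenates each model's pooled feature vector per image, so e.g. three 2048-d vectors plus one 1536-d vector give a 7680-d classifier input. A NumPy sketch:

```python
import numpy as np


def merge_features(feature_sets):
    """Concatenate per-model feature matrices along the feature axis.

    feature_sets: list of arrays, each of shape (n_samples, n_features_i).
    Returns an array of shape (n_samples, sum of n_features_i).
    """
    return np.concatenate(feature_sets, axis=1)
```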

Training the merged model with Adadelta

In [9]:
from keras.callbacks import *
# merge_models_name = ["InceptionV3","Xception","ResNet50","VGG16","InceptionResNetV2"]
merge_models_name = ["InceptionV3","Xception","ResNet50","InceptionResNetV2"]
merged_name = "_".join(merge_models_name)
tensorboard_directory = "logs/merge/%s_epoch%s"%(merged_name,7)
del_file_if_exist(tensorboard_directory)

Xs,y = load_train_data_merge(model_names=merge_models_name)

model,history = fit_model_merge_output(Xs,y,
                                       merged_name=merged_name,
                                       epochs=7,
                                       callbacks=[TensorBoard(tensorboard_directory,batch_size=BATCH_SIZE)],
                                       auto_save=True)

 
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
Train on 20000 samples, validate on 5000 samples
Epoch 1/7
20000/20000 [==============================] - 1s 45us/step - loss: 0.0717 - acc: 0.9755 - val_loss: 0.0237 - val_acc: 0.9920
Epoch 2/7
20000/20000 [==============================] - 1s 41us/step - loss: 0.0201 - acc: 0.9936 - val_loss: 0.0174 - val_acc: 0.9942
Epoch 3/7
20000/20000 [==============================] - 1s 41us/step - loss: 0.0166 - acc: 0.9950 - val_loss: 0.0163 - val_acc: 0.9946
Epoch 4/7
20000/20000 [==============================] - 1s 41us/step - loss: 0.0153 - acc: 0.9951 - val_loss: 0.0174 - val_acc: 0.9936
Epoch 5/7
20000/20000 [==============================] - 1s 41us/step - loss: 0.0130 - acc: 0.9959 - val_loss: 0.0167 - val_acc: 0.9950
Epoch 6/7
20000/20000 [==============================] - 1s 41us/step - loss: 0.0139 - acc: 0.9956 - val_loss: 0.0166 - val_acc: 0.9940
Epoch 7/7
20000/20000 [==============================] - 1s 41us/step - loss: 0.0108 - acc: 0.9967 - val_loss: 0.0165 - val_acc: 0.9946

Plot the model architecture

In [10]:
# Render the model architecture graphically (SVG and model_to_dot come in via the wildcard imports)
SVG(model_to_dot(model, show_shapes=True).create(prog='dot', format='svg'))
Out[10]:
[Model graph: two InputLayers of shape (None, 2048) feed a Concatenate layer (None, 4096), followed by Dropout (None, 4096) and a Dense output (None, 1).]

Training the merged model: tuning SGD

In [9]:
# merge_models_name = ["InceptionV3","Xception","ResNet50","VGG16","InceptionResNetV2"]
merge_models_name = ["InceptionV3","Xception","ResNet50"]

Xs,y = load_train_data_merge(model_names=merge_models_name)
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
load features from 「original_InceptionResNetV2.h5」
In [17]:
learn_rates=[0.001,0.0001]
momentums = [0.2,0.5,0.9]
decays = [1e-4,1e-5,1e-6]
max_epochs = 500
earlystop_patience = 7

# TensorBoard is needed as a callback to record each run, but sklearn's GridSearchCV does not support
# callbacks, so the parameter grid is looped over manually and TensorBoard is used to find the best combination.
merged_name = "_".join(merge_models_name)
best_model = None
history_minloss = float("inf")  # lowest validation loss seen so far
for learning_rate in learn_rates:
    for momentum in momentums:
        for decay in decays:
            
            sgd = optimizers.SGD(lr=learning_rate, decay=decay, momentum=momentum,nesterov=True)

            tensorboard_directory = "logs/merge/%s_lr%s_decay%s_momentum%s_nesterov"%(merged_name,learning_rate,decay,momentum)
            del_file_if_exist(tensorboard_directory)
            tensorboard = TensorBoard(tensorboard_directory,batch_size=BATCH_SIZE)
            earlyStop = EarlyStopping(monitor='val_loss', patience=earlystop_patience, verbose=1)
            checkpoint = ModelCheckpoint("model/merge_checkpoint/%s_lr%s_decay%s_momentum%s_nesterov"%(merged_name,learning_rate,decay,momentum),
                            monitor='val_loss', save_best_only=True, verbose=0)
            
            callbacks=[tensorboard,earlyStop,checkpoint]
            
            # i.e. models["VGG"]=(model,history)

            print("\nStart Fitting Merged Model %s \nwith lr:%s ,mom:%s , decay:%s"%(merged_name,learning_rate,momentum,decay))
            
            model,history = fit_model_merge_output(Xs,y,
                                       merged_name=merged_name,
                                       optimizer=sgd,
                                       epochs=max_epochs,
                                       callbacks=callbacks,
                                       auto_save=False)
            
            # with EarlyStopping, compare on the best (not last) epoch's val_loss
            best_val_loss = min(history.history['val_loss'])
            if history_minloss > best_val_loss:
                best_model = model
                history_minloss = best_val_loss
                print("The model is better than ever!")
                best_model.save("model/%s" % get_filename(merged_name, "merge", "output_sgd"))
            else:
                print("The model is not good enough")
Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2 
with lr:0.001 ,mom:0.2 , decay:0.0001
[(2048,), (2048,), (2048,), (1536,)]
model created
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 54us/step - loss: 0.2280 - acc: 0.9331 - val_loss: 0.0889 - val_acc: 0.9868
Epoch 2/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0875 - acc: 0.9815 - val_loss: 0.0588 - val_acc: 0.9894
Epoch 3/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0637 - acc: 0.9867 - val_loss: 0.0474 - val_acc: 0.9910
Epoch 4/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0521 - acc: 0.9888 - val_loss: 0.0410 - val_acc: 0.9918
Epoch 5/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0474 - acc: 0.9882 - val_loss: 0.0369 - val_acc: 0.9922
Epoch 6/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0424 - acc: 0.9897 - val_loss: 0.0339 - val_acc: 0.9926
Epoch 7/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0378 - acc: 0.9911 - val_loss: 0.0317 - val_acc: 0.9930
Epoch 8/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0369 - acc: 0.9909 - val_loss: 0.0300 - val_acc: 0.9932
Epoch 9/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0350 - acc: 0.9914 - val_loss: 0.0287 - val_acc: 0.9930
Epoch 10/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0332 - acc: 0.9911 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 11/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0321 - acc: 0.9916 - val_loss: 0.0266 - val_acc: 0.9932
Epoch 12/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0324 - acc: 0.9910 - val_loss: 0.0257 - val_acc: 0.9936
Epoch 13/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0302 - acc: 0.9922 - val_loss: 0.0250 - val_acc: 0.9938
Epoch 14/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0296 - acc: 0.9914 - val_loss: 0.0243 - val_acc: 0.9940
Epoch 15/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0283 - acc: 0.9926 - val_loss: 0.0238 - val_acc: 0.9940
Epoch 16/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0285 - acc: 0.9914 - val_loss: 0.0233 - val_acc: 0.9940
Epoch 17/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0265 - acc: 0.9922 - val_loss: 0.0229 - val_acc: 0.9940
Epoch 18/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0274 - acc: 0.9920 - val_loss: 0.0226 - val_acc: 0.9940
Epoch 19/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0258 - acc: 0.9929 - val_loss: 0.0223 - val_acc: 0.9938
Epoch 20/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0248 - acc: 0.9933 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 21/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0255 - acc: 0.9923 - val_loss: 0.0216 - val_acc: 0.9938
Epoch 22/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0249 - acc: 0.9935 - val_loss: 0.0213 - val_acc: 0.9944
Epoch 23/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0249 - acc: 0.9928 - val_loss: 0.0211 - val_acc: 0.9944
Epoch 24/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9934 - val_loss: 0.0208 - val_acc: 0.9944
Epoch 25/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0234 - acc: 0.9938 - val_loss: 0.0207 - val_acc: 0.9936
Epoch 26/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9928 - val_loss: 0.0204 - val_acc: 0.9942
Epoch 27/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0223 - acc: 0.9937 - val_loss: 0.0203 - val_acc: 0.9942
Epoch 28/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0226 - acc: 0.9933 - val_loss: 0.0202 - val_acc: 0.9940
Epoch 29/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0219 - acc: 0.9935 - val_loss: 0.0200 - val_acc: 0.9940
Epoch 30/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0227 - acc: 0.9935 - val_loss: 0.0197 - val_acc: 0.9944
Epoch 31/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0221 - acc: 0.9938 - val_loss: 0.0196 - val_acc: 0.9946
Epoch 32/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0223 - acc: 0.9937 - val_loss: 0.0195 - val_acc: 0.9946
Epoch 33/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0216 - acc: 0.9937 - val_loss: 0.0193 - val_acc: 0.9946
Epoch 34/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0220 - acc: 0.9932 - val_loss: 0.0192 - val_acc: 0.9946
Epoch 35/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0215 - acc: 0.9931 - val_loss: 0.0190 - val_acc: 0.9948
Epoch 36/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0219 - acc: 0.9935 - val_loss: 0.0190 - val_acc: 0.9948
Epoch 37/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0212 - acc: 0.9940 - val_loss: 0.0189 - val_acc: 0.9948
Epoch 38/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0202 - acc: 0.9940 - val_loss: 0.0188 - val_acc: 0.9948
Epoch 39/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0205 - acc: 0.9935 - val_loss: 0.0187 - val_acc: 0.9948
Epoch 40/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0213 - acc: 0.9937 - val_loss: 0.0186 - val_acc: 0.9948
Epoch 41/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0206 - acc: 0.9936 - val_loss: 0.0185 - val_acc: 0.9948
Epoch 42/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0205 - acc: 0.9940 - val_loss: 0.0184 - val_acc: 0.9948
Epoch 43/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0193 - acc: 0.9944 - val_loss: 0.0184 - val_acc: 0.9948
Epoch 44/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0207 - acc: 0.9941 - val_loss: 0.0182 - val_acc: 0.9948
Epoch 45/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0203 - acc: 0.9943 - val_loss: 0.0182 - val_acc: 0.9948
Epoch 46/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0203 - acc: 0.9935 - val_loss: 0.0181 - val_acc: 0.9948
Epoch 47/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0192 - acc: 0.9944 - val_loss: 0.0181 - val_acc: 0.9948
Epoch 48/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0195 - acc: 0.9942 - val_loss: 0.0180 - val_acc: 0.9948
Epoch 49/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0202 - acc: 0.9941 - val_loss: 0.0179 - val_acc: 0.9948
Epoch 50/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0196 - acc: 0.9942 - val_loss: 0.0178 - val_acc: 0.9948
Epoch 51/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0189 - acc: 0.9945 - val_loss: 0.0178 - val_acc: 0.9948
Epoch 52/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0194 - acc: 0.9941 - val_loss: 0.0177 - val_acc: 0.9948
Epoch 53/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0191 - acc: 0.9944 - val_loss: 0.0177 - val_acc: 0.9948
Epoch 54/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0183 - acc: 0.9949 - val_loss: 0.0177 - val_acc: 0.9948
Epoch 55/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0197 - acc: 0.9939 - val_loss: 0.0176 - val_acc: 0.9948
Epoch 56/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0193 - acc: 0.9936 - val_loss: 0.0175 - val_acc: 0.9948
Epoch 57/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0189 - acc: 0.9940 - val_loss: 0.0175 - val_acc: 0.9950
Epoch 58/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0186 - acc: 0.9941 - val_loss: 0.0174 - val_acc: 0.9950
Epoch 59/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0182 - acc: 0.9939 - val_loss: 0.0173 - val_acc: 0.9950
Epoch 60/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0187 - acc: 0.9941 - val_loss: 0.0173 - val_acc: 0.9950
Epoch 61/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0187 - acc: 0.9953 - val_loss: 0.0172 - val_acc: 0.9950
Epoch 62/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0184 - acc: 0.9943 - val_loss: 0.0172 - val_acc: 0.9950
Epoch 63/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0181 - acc: 0.9948 - val_loss: 0.0172 - val_acc: 0.9950
Epoch 64/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0184 - acc: 0.9946 - val_loss: 0.0172 - val_acc: 0.9950
Epoch 65/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0189 - acc: 0.9940 - val_loss: 0.0171 - val_acc: 0.9950
Epoch 66/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0180 - acc: 0.9945 - val_loss: 0.0171 - val_acc: 0.9950
Epoch 67/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0185 - acc: 0.9946 - val_loss: 0.0170 - val_acc: 0.9950
Epoch 68/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0181 - acc: 0.9943 - val_loss: 0.0170 - val_acc: 0.9950
Epoch 69/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0178 - acc: 0.9946 - val_loss: 0.0170 - val_acc: 0.9950
Epoch 70/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0181 - acc: 0.9947 - val_loss: 0.0170 - val_acc: 0.9948
Epoch 71/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0182 - acc: 0.9940 - val_loss: 0.0169 - val_acc: 0.9950
Epoch 72/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0180 - acc: 0.9944 - val_loss: 0.0169 - val_acc: 0.9950
Epoch 73/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0173 - acc: 0.9954 - val_loss: 0.0168 - val_acc: 0.9952
Epoch 74/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0179 - acc: 0.9944 - val_loss: 0.0168 - val_acc: 0.9950
Epoch 75/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0181 - acc: 0.9944 - val_loss: 0.0168 - val_acc: 0.9948
[... epochs 76–191 elided; loss drifts from 0.0180 down to 0.0157, val_loss from 0.0168 to 0.0154, val_acc holds between 0.9948 and 0.9960 ...]
Epoch 192/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0153 - acc: 0.9950 - val_loss: 0.0154 - val_acc: 0.9958
Epoch 193/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0149 - acc: 0.9958 - val_loss: 0.0154 - val_acc: 0.9958
Epoch 00193: early stopping
The model is better than ever!
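The run above halts at epoch 193 because an early-stopping rule on `val_loss` triggered long before the 500-epoch budget. The core of that rule can be sketched without Keras; the `patience` and `min_delta` values below are illustrative assumptions, not the ones used in this notebook:

```python
def early_stop_epoch(val_losses, patience=10, min_delta=0.0):
    """Return the 1-based epoch at which an early-stopping rule on
    val_loss would halt training, or None if it never triggers.

    Training stops once val_loss has failed to improve by more than
    `min_delta` for `patience` consecutive epochs.
    """
    best = float("inf")  # best val_loss seen so far
    wait = 0             # epochs since the last improvement
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best - min_delta:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return None

# With patience=3, stopping triggers after three non-improving epochs:
print(early_stop_epoch([0.5, 0.4, 0.4, 0.4, 0.4], patience=3))  # → 5
```

In Keras this corresponds to passing an `EarlyStopping(monitor='val_loss', patience=...)` callback to `model.fit`; the tiny val_loss improvements in the log (0.0168 → 0.0154 over ~120 epochs) show why a nonzero `min_delta` can stop such plateaus much earlier.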

Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2 
with lr:0.001 ,mom:0.2 , decay:1e-05
[(2048,), (2048,), (2048,), (1536,)]
model created
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 55us/step - loss: 0.2284 - acc: 0.9317 - val_loss: 0.0907 - val_acc: 0.9884
Epoch 2/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0872 - acc: 0.9831 - val_loss: 0.0590 - val_acc: 0.9912
Epoch 3/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0621 - acc: 0.9877 - val_loss: 0.0473 - val_acc: 0.9918
Epoch 4/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0513 - acc: 0.9884 - val_loss: 0.0406 - val_acc: 0.9924
Epoch 5/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0454 - acc: 0.9900 - val_loss: 0.0365 - val_acc: 0.9924
Epoch 6/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0415 - acc: 0.9902 - val_loss: 0.0334 - val_acc: 0.9928
Epoch 7/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0395 - acc: 0.9896 - val_loss: 0.0312 - val_acc: 0.9928
Epoch 8/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0348 - acc: 0.9917 - val_loss: 0.0296 - val_acc: 0.9928
Epoch 9/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0343 - acc: 0.9915 - val_loss: 0.0281 - val_acc: 0.9930
Epoch 10/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0317 - acc: 0.9920 - val_loss: 0.0268 - val_acc: 0.9936
[... epochs 11–133 elided; loss falls from 0.0297 to 0.0139, val_loss from 0.0260 to 0.0151, val_acc settles around 0.9952 ...]
Epoch 134/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0132 - acc: 0.9960 - val_loss: 0.0152 - val_acc: 0.9952
Epoch 135/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0137 - acc: 0.9959 - val_loss: 0.0153 - val_acc: 0.9952
Epoch 136/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0134 - acc: 0.9956 - val_loss: 0.0153 - val_acc: 0.9954
Epoch 137/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0138 - acc: 0.9956 - val_loss: 0.0152 - val_acc: 0.9952
Epoch 138/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0132 - acc: 0.9954 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 139/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0131 - acc: 0.9964 - val_loss: 0.0152 - val_acc: 0.9952
Epoch 140/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0142 - acc: 0.9959 - val_loss: 0.0152 - val_acc: 0.9952
Epoch 141/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0127 - acc: 0.9964 - val_loss: 0.0152 - val_acc: 0.9952
Epoch 142/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0139 - acc: 0.9955 - val_loss: 0.0151 - val_acc: 0.9952
Epoch 143/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0131 - acc: 0.9960 - val_loss: 0.0152 - val_acc: 0.9952
Epoch 144/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0131 - acc: 0.9961 - val_loss: 0.0152 - val_acc: 0.9952
Epoch 145/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0134 - acc: 0.9959 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 146/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0133 - acc: 0.9962 - val_loss: 0.0152 - val_acc: 0.9954
Epoch 147/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0128 - acc: 0.9959 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 148/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0128 - acc: 0.9960 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 149/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0132 - acc: 0.9961 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 150/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0132 - acc: 0.9955 - val_loss: 0.0150 - val_acc: 0.9954
Epoch 151/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0130 - acc: 0.9963 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 152/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0141 - acc: 0.9958 - val_loss: 0.0150 - val_acc: 0.9954
Epoch 153/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0135 - acc: 0.9958 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 154/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0135 - acc: 0.9960 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 155/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0130 - acc: 0.9959 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 156/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0138 - acc: 0.9954 - val_loss: 0.0150 - val_acc: 0.9954
Epoch 157/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0127 - acc: 0.9967 - val_loss: 0.0150 - val_acc: 0.9954
Epoch 00157: early stopping
The model is better than ever!
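The run header above shows the optimizer settings (`lr:0.001, mom:0.2, decay:1e-06`). As a point of reference for why the loss curve flattens so gradually, Keras's `SGD` applies time-based decay per batch update: `lr_t = lr / (1 + decay * iterations)`. The helper below is a hypothetical illustration of that rule, not code from this notebook:

```python
def effective_lr(lr, decay, iterations):
    """Learning rate after a given number of batch updates,
    following Keras SGD's time-based decay: lr / (1 + decay * t)."""
    return lr / (1.0 + decay * iterations)

# With lr=0.001 and decay=1e-06 (the run above), the decay is negligible:
# 157 epochs x 157 updates/epoch (20000 samples, batch size 128)
updates = 157 * (20000 // 128 + 1)
print(effective_lr(0.001, 1e-06, updates))  # still ~0.00098, close to 0.001
```

This explains why a `decay` of `1e-06` barely changes the learning rate over an entire run, whereas the later run with `decay:0.0001` shrinks it much faster.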

Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2 
with lr:0.001 ,mom:0.2 , decay:1e-06
[(2048,), (2048,), (2048,), (1536,)]
model created
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 57us/step - loss: 0.2365 - acc: 0.9282 - val_loss: 0.0889 - val_acc: 0.9864
Epoch 2/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0836 - acc: 0.9845 - val_loss: 0.0573 - val_acc: 0.9892
Epoch 3/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0605 - acc: 0.9879 - val_loss: 0.0453 - val_acc: 0.9910
Epoch 4/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0491 - acc: 0.9898 - val_loss: 0.0390 - val_acc: 0.9916
Epoch 5/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0443 - acc: 0.9901 - val_loss: 0.0350 - val_acc: 0.9920
Epoch 6/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0385 - acc: 0.9915 - val_loss: 0.0323 - val_acc: 0.9924
Epoch 7/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0359 - acc: 0.9917 - val_loss: 0.0302 - val_acc: 0.9926
Epoch 8/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0341 - acc: 0.9918 - val_loss: 0.0284 - val_acc: 0.9932
Epoch 9/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0314 - acc: 0.9923 - val_loss: 0.0270 - val_acc: 0.9932
Epoch 10/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0310 - acc: 0.9924 - val_loss: 0.0259 - val_acc: 0.9934
Epoch 11/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0298 - acc: 0.9922 - val_loss: 0.0249 - val_acc: 0.9938
Epoch 12/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0282 - acc: 0.9926 - val_loss: 0.0242 - val_acc: 0.9940
Epoch 13/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0278 - acc: 0.9923 - val_loss: 0.0236 - val_acc: 0.9940
Epoch 14/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0260 - acc: 0.9932 - val_loss: 0.0231 - val_acc: 0.9940
Epoch 15/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0266 - acc: 0.9921 - val_loss: 0.0225 - val_acc: 0.9940
Epoch 16/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0254 - acc: 0.9933 - val_loss: 0.0220 - val_acc: 0.9940
Epoch 17/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0248 - acc: 0.9938 - val_loss: 0.0216 - val_acc: 0.9940
Epoch 18/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0243 - acc: 0.9930 - val_loss: 0.0213 - val_acc: 0.9940
Epoch 19/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9931 - val_loss: 0.0208 - val_acc: 0.9940
Epoch 20/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0243 - acc: 0.9932 - val_loss: 0.0205 - val_acc: 0.9940
Epoch 21/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0238 - acc: 0.9931 - val_loss: 0.0200 - val_acc: 0.9942
Epoch 22/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0226 - acc: 0.9941 - val_loss: 0.0200 - val_acc: 0.9940
Epoch 23/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0224 - acc: 0.9935 - val_loss: 0.0198 - val_acc: 0.9940
Epoch 24/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0217 - acc: 0.9940 - val_loss: 0.0196 - val_acc: 0.9940
Epoch 25/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0213 - acc: 0.9940 - val_loss: 0.0194 - val_acc: 0.9940
Epoch 26/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0214 - acc: 0.9936 - val_loss: 0.0191 - val_acc: 0.9940
Epoch 27/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0219 - acc: 0.9937 - val_loss: 0.0190 - val_acc: 0.9940
Epoch 28/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0207 - acc: 0.9938 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 29/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0206 - acc: 0.9940 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 30/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0205 - acc: 0.9935 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 31/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0204 - acc: 0.9937 - val_loss: 0.0182 - val_acc: 0.9944
Epoch 32/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0199 - acc: 0.9940 - val_loss: 0.0181 - val_acc: 0.9944
Epoch 33/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0201 - acc: 0.9943 - val_loss: 0.0179 - val_acc: 0.9944
Epoch 34/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0195 - acc: 0.9943 - val_loss: 0.0179 - val_acc: 0.9942
Epoch 35/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0195 - acc: 0.9942 - val_loss: 0.0179 - val_acc: 0.9940
Epoch 36/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0180 - acc: 0.9944 - val_loss: 0.0177 - val_acc: 0.9940
Epoch 37/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0195 - acc: 0.9938 - val_loss: 0.0174 - val_acc: 0.9946
Epoch 38/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9944 - val_loss: 0.0175 - val_acc: 0.9944
Epoch 39/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0179 - acc: 0.9945 - val_loss: 0.0174 - val_acc: 0.9944
Epoch 40/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0187 - acc: 0.9946 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 41/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0182 - acc: 0.9946 - val_loss: 0.0171 - val_acc: 0.9948
Epoch 42/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0180 - acc: 0.9949 - val_loss: 0.0171 - val_acc: 0.9948
Epoch 43/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0183 - acc: 0.9948 - val_loss: 0.0170 - val_acc: 0.9946
Epoch 44/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0183 - acc: 0.9943 - val_loss: 0.0170 - val_acc: 0.9946
Epoch 45/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0174 - acc: 0.9950 - val_loss: 0.0170 - val_acc: 0.9946
Epoch 46/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0168 - acc: 0.9949 - val_loss: 0.0169 - val_acc: 0.9946
Epoch 47/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0173 - acc: 0.9946 - val_loss: 0.0168 - val_acc: 0.9946
Epoch 48/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0177 - acc: 0.9950 - val_loss: 0.0166 - val_acc: 0.9948
Epoch 49/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0179 - acc: 0.9949 - val_loss: 0.0166 - val_acc: 0.9948
Epoch 50/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0176 - acc: 0.9941 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 51/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0168 - acc: 0.9947 - val_loss: 0.0166 - val_acc: 0.9948
Epoch 52/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0163 - acc: 0.9948 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 53/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0173 - acc: 0.9946 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 54/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0165 - acc: 0.9948 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 55/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0178 - acc: 0.9950 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 56/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0178 - acc: 0.9942 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 57/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0170 - acc: 0.9950 - val_loss: 0.0162 - val_acc: 0.9952
Epoch 58/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0168 - acc: 0.9951 - val_loss: 0.0162 - val_acc: 0.9952
Epoch 59/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0163 - acc: 0.9952 - val_loss: 0.0162 - val_acc: 0.9948
Epoch 60/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0169 - acc: 0.9946 - val_loss: 0.0161 - val_acc: 0.9952
Epoch 61/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0158 - acc: 0.9954 - val_loss: 0.0160 - val_acc: 0.9952
Epoch 62/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0160 - acc: 0.9955 - val_loss: 0.0160 - val_acc: 0.9952
Epoch 63/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0158 - acc: 0.9951 - val_loss: 0.0160 - val_acc: 0.9952
Epoch 64/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0164 - acc: 0.9950 - val_loss: 0.0161 - val_acc: 0.9948
Epoch 65/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0166 - acc: 0.9946 - val_loss: 0.0159 - val_acc: 0.9952
Epoch 66/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0163 - acc: 0.9951 - val_loss: 0.0159 - val_acc: 0.9952
Epoch 67/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0160 - acc: 0.9949 - val_loss: 0.0158 - val_acc: 0.9952
Epoch 68/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9949 - val_loss: 0.0159 - val_acc: 0.9952
Epoch 69/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0156 - acc: 0.9950 - val_loss: 0.0159 - val_acc: 0.9952
Epoch 70/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0154 - acc: 0.9950 - val_loss: 0.0157 - val_acc: 0.9952
Epoch 71/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0159 - acc: 0.9950 - val_loss: 0.0157 - val_acc: 0.9952
Epoch 72/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0149 - acc: 0.9958 - val_loss: 0.0157 - val_acc: 0.9952
Epoch 73/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0158 - acc: 0.9951 - val_loss: 0.0156 - val_acc: 0.9952
Epoch 74/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0153 - acc: 0.9949 - val_loss: 0.0155 - val_acc: 0.9954
Epoch 75/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0171 - acc: 0.9948 - val_loss: 0.0156 - val_acc: 0.9952
Epoch 76/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0157 - acc: 0.9951 - val_loss: 0.0156 - val_acc: 0.9952
Epoch 77/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0154 - acc: 0.9948 - val_loss: 0.0155 - val_acc: 0.9954
Epoch 78/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0149 - acc: 0.9956 - val_loss: 0.0155 - val_acc: 0.9954
Epoch 79/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0156 - acc: 0.9953 - val_loss: 0.0155 - val_acc: 0.9952
Epoch 80/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0148 - acc: 0.9950 - val_loss: 0.0155 - val_acc: 0.9952
Epoch 81/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0151 - acc: 0.9954 - val_loss: 0.0155 - val_acc: 0.9952
Epoch 82/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0152 - acc: 0.9953 - val_loss: 0.0154 - val_acc: 0.9954
Epoch 83/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0145 - acc: 0.9955 - val_loss: 0.0155 - val_acc: 0.9954
Epoch 84/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0155 - acc: 0.9954 - val_loss: 0.0154 - val_acc: 0.9954
Epoch 85/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0150 - acc: 0.9955 - val_loss: 0.0154 - val_acc: 0.9952
Epoch 86/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0147 - acc: 0.9956 - val_loss: 0.0154 - val_acc: 0.9954
Epoch 87/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0143 - acc: 0.9959 - val_loss: 0.0154 - val_acc: 0.9954
Epoch 88/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0145 - acc: 0.9958 - val_loss: 0.0154 - val_acc: 0.9952
Epoch 89/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0149 - acc: 0.9953 - val_loss: 0.0152 - val_acc: 0.9954
Epoch 90/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0151 - acc: 0.9954 - val_loss: 0.0153 - val_acc: 0.9954
Epoch 91/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0142 - acc: 0.9952 - val_loss: 0.0153 - val_acc: 0.9954
Epoch 92/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0148 - acc: 0.9954 - val_loss: 0.0152 - val_acc: 0.9954
Epoch 93/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0151 - acc: 0.9950 - val_loss: 0.0152 - val_acc: 0.9954
Epoch 94/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0146 - acc: 0.9959 - val_loss: 0.0152 - val_acc: 0.9954
Epoch 95/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0148 - acc: 0.9955 - val_loss: 0.0152 - val_acc: 0.9954
Epoch 96/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0149 - acc: 0.9952 - val_loss: 0.0152 - val_acc: 0.9954
Epoch 97/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0143 - acc: 0.9955 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 98/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0143 - acc: 0.9953 - val_loss: 0.0150 - val_acc: 0.9954
Epoch 99/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0144 - acc: 0.9958 - val_loss: 0.0151 - val_acc: 0.9956
Epoch 100/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0145 - acc: 0.9957 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 101/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0135 - acc: 0.9963 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 102/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0140 - acc: 0.9959 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 103/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0143 - acc: 0.9953 - val_loss: 0.0150 - val_acc: 0.9956
Epoch 104/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0139 - acc: 0.9959 - val_loss: 0.0150 - val_acc: 0.9956
Epoch 105/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0137 - acc: 0.9955 - val_loss: 0.0149 - val_acc: 0.9954
Epoch 106/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0140 - acc: 0.9954 - val_loss: 0.0150 - val_acc: 0.9954
Epoch 107/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0139 - acc: 0.9960 - val_loss: 0.0150 - val_acc: 0.9956
Epoch 108/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0136 - acc: 0.9958 - val_loss: 0.0149 - val_acc: 0.9956
Epoch 109/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0143 - acc: 0.9955 - val_loss: 0.0150 - val_acc: 0.9956
Epoch 110/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0135 - acc: 0.9960 - val_loss: 0.0149 - val_acc: 0.9958
Epoch 111/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0142 - acc: 0.9960 - val_loss: 0.0149 - val_acc: 0.9958
Epoch 112/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0133 - acc: 0.9961 - val_loss: 0.0149 - val_acc: 0.9958
Epoch 113/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0138 - acc: 0.9960 - val_loss: 0.0150 - val_acc: 0.9956
Epoch 114/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0134 - acc: 0.9957 - val_loss: 0.0149 - val_acc: 0.9958
Epoch 115/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0139 - acc: 0.9959 - val_loss: 0.0150 - val_acc: 0.9956
Epoch 116/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0136 - acc: 0.9958 - val_loss: 0.0148 - val_acc: 0.9956
Epoch 117/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0138 - acc: 0.9955 - val_loss: 0.0149 - val_acc: 0.9956
Epoch 118/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0148 - val_acc: 0.9956
Epoch 119/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0131 - acc: 0.9961 - val_loss: 0.0148 - val_acc: 0.9958
Epoch 120/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0132 - acc: 0.9960 - val_loss: 0.0148 - val_acc: 0.9958
Epoch 121/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0130 - acc: 0.9960 - val_loss: 0.0148 - val_acc: 0.9958
Epoch 122/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0139 - acc: 0.9960 - val_loss: 0.0148 - val_acc: 0.9958
Epoch 123/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0140 - acc: 0.9957 - val_loss: 0.0148 - val_acc: 0.9956
Epoch 124/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0130 - acc: 0.9961 - val_loss: 0.0147 - val_acc: 0.9956
Epoch 125/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0128 - acc: 0.9959 - val_loss: 0.0147 - val_acc: 0.9956
Epoch 126/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0131 - acc: 0.9960 - val_loss: 0.0147 - val_acc: 0.9956
Epoch 127/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0139 - acc: 0.9957 - val_loss: 0.0147 - val_acc: 0.9956
Epoch 128/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0129 - acc: 0.9960 - val_loss: 0.0146 - val_acc: 0.9956
Epoch 129/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0131 - acc: 0.9961 - val_loss: 0.0146 - val_acc: 0.9956
Epoch 130/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0131 - acc: 0.9959 - val_loss: 0.0147 - val_acc: 0.9956
Epoch 131/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0131 - acc: 0.9955 - val_loss: 0.0147 - val_acc: 0.9956
Epoch 132/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0140 - acc: 0.9957 - val_loss: 0.0147 - val_acc: 0.9958
Epoch 133/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0137 - acc: 0.9960 - val_loss: 0.0147 - val_acc: 0.9958
Epoch 134/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0137 - acc: 0.9956 - val_loss: 0.0147 - val_acc: 0.9958
Epoch 135/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0135 - acc: 0.9958 - val_loss: 0.0148 - val_acc: 0.9958
Epoch 136/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0129 - acc: 0.9960 - val_loss: 0.0148 - val_acc: 0.9958
Epoch 00136: early stopping
The model is better than ever!
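Each run stops well short of the configured 500 epochs (`Epoch 00136: early stopping` above), which is consistent with a Keras `EarlyStopping` callback monitoring `val_loss`. The notebook's actual callback settings are not shown here; the sketch below re-implements the patience logic with assumed parameter names to show how the stopping epoch is determined:

```python
def early_stop_epoch(val_losses, patience=20, min_delta=0.0):
    """Return the 1-based epoch at which training would stop, or None.

    Mirrors Keras EarlyStopping on val_loss: stop once the loss has not
    improved on the best value by more than min_delta for `patience`
    consecutive epochs.
    """
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best - min_delta:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return None  # ran all epochs without triggering

# A loss curve that plateaus after epoch 3 stops once patience runs out:
print(early_stop_epoch([0.5, 0.3, 0.2, 0.2, 0.2, 0.2, 0.2], patience=3))  # -> 6
```

With a small `patience`, a long plateau like the one in the logs (val_loss stuck near 0.0147-0.0148 for many epochs) is exactly what triggers the stop.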

Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2 
with lr:0.001 ,mom:0.5 , decay:0.0001
[(2048,), (2048,), (2048,), (1536,)]
model created
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 59us/step - loss: 0.1938 - acc: 0.9374 - val_loss: 0.0702 - val_acc: 0.9858
Epoch 2/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0681 - acc: 0.9849 - val_loss: 0.0475 - val_acc: 0.9902
Epoch 3/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0497 - acc: 0.9876 - val_loss: 0.0390 - val_acc: 0.9910
Epoch 4/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0435 - acc: 0.9900 - val_loss: 0.0339 - val_acc: 0.9926
Epoch 5/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0381 - acc: 0.9910 - val_loss: 0.0308 - val_acc: 0.9930
Epoch 6/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0347 - acc: 0.9905 - val_loss: 0.0286 - val_acc: 0.9934
Epoch 7/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0319 - acc: 0.9915 - val_loss: 0.0268 - val_acc: 0.9940
Epoch 8/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0307 - acc: 0.9922 - val_loss: 0.0256 - val_acc: 0.9942
Epoch 9/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0288 - acc: 0.9919 - val_loss: 0.0245 - val_acc: 0.9944
Epoch 10/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0275 - acc: 0.9927 - val_loss: 0.0236 - val_acc: 0.9946
Epoch 11/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0268 - acc: 0.9922 - val_loss: 0.0231 - val_acc: 0.9946
Epoch 12/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0261 - acc: 0.9919 - val_loss: 0.0224 - val_acc: 0.9946
Epoch 13/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0253 - acc: 0.9934 - val_loss: 0.0220 - val_acc: 0.9946
Epoch 14/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0248 - acc: 0.9924 - val_loss: 0.0212 - val_acc: 0.9946
Epoch 15/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0239 - acc: 0.9929 - val_loss: 0.0209 - val_acc: 0.9946
Epoch 16/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0235 - acc: 0.9934 - val_loss: 0.0205 - val_acc: 0.9946
Epoch 17/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0238 - acc: 0.9931 - val_loss: 0.0203 - val_acc: 0.9946
Epoch 18/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0228 - acc: 0.9931 - val_loss: 0.0200 - val_acc: 0.9948
Epoch 19/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0224 - acc: 0.9931 - val_loss: 0.0199 - val_acc: 0.9948
Epoch 20/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0220 - acc: 0.9935 - val_loss: 0.0195 - val_acc: 0.9948
Epoch 21/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0215 - acc: 0.9929 - val_loss: 0.0193 - val_acc: 0.9948
Epoch 22/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0220 - acc: 0.9928 - val_loss: 0.0191 - val_acc: 0.9948
Epoch 23/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0205 - acc: 0.9940 - val_loss: 0.0188 - val_acc: 0.9950
Epoch 24/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0198 - acc: 0.9942 - val_loss: 0.0186 - val_acc: 0.9948
Epoch 25/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0207 - acc: 0.9940 - val_loss: 0.0186 - val_acc: 0.9948
Epoch 26/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0202 - acc: 0.9936 - val_loss: 0.0185 - val_acc: 0.9950
Epoch 27/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0202 - acc: 0.9942 - val_loss: 0.0182 - val_acc: 0.9952
Epoch 28/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0194 - acc: 0.9938 - val_loss: 0.0183 - val_acc: 0.9948
Epoch 29/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0189 - acc: 0.9940 - val_loss: 0.0181 - val_acc: 0.9954
Epoch 30/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0195 - acc: 0.9944 - val_loss: 0.0179 - val_acc: 0.9952
Epoch 31/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0196 - acc: 0.9941 - val_loss: 0.0177 - val_acc: 0.9954
Epoch 32/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0204 - acc: 0.9940 - val_loss: 0.0176 - val_acc: 0.9954
Epoch 33/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0194 - acc: 0.9939 - val_loss: 0.0177 - val_acc: 0.9952
Epoch 34/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0189 - acc: 0.9940 - val_loss: 0.0175 - val_acc: 0.9952
Epoch 35/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0189 - acc: 0.9937 - val_loss: 0.0175 - val_acc: 0.9952
Epoch 36/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0179 - acc: 0.9946 - val_loss: 0.0173 - val_acc: 0.9956
Epoch 37/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0191 - acc: 0.9944 - val_loss: 0.0173 - val_acc: 0.9956
Epoch 38/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0185 - acc: 0.9950 - val_loss: 0.0174 - val_acc: 0.9950
Epoch 39/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0182 - acc: 0.9947 - val_loss: 0.0172 - val_acc: 0.9956
Epoch 40/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0184 - acc: 0.9942 - val_loss: 0.0172 - val_acc: 0.9954
Epoch 41/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0189 - acc: 0.9937 - val_loss: 0.0171 - val_acc: 0.9954
Epoch 42/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0175 - acc: 0.9947 - val_loss: 0.0171 - val_acc: 0.9954
Epoch 43/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0176 - acc: 0.9943 - val_loss: 0.0171 - val_acc: 0.9954
Epoch 44/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0179 - acc: 0.9946 - val_loss: 0.0169 - val_acc: 0.9956
Epoch 45/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0176 - acc: 0.9943 - val_loss: 0.0167 - val_acc: 0.9956
Epoch 46/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0176 - acc: 0.9946 - val_loss: 0.0169 - val_acc: 0.9954
Epoch 47/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0174 - acc: 0.9944 - val_loss: 0.0166 - val_acc: 0.9956
Epoch 48/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0167 - acc: 0.9944 - val_loss: 0.0166 - val_acc: 0.9956
Epoch 49/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0167 - acc: 0.9949 - val_loss: 0.0166 - val_acc: 0.9956
Epoch 50/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0174 - acc: 0.9949 - val_loss: 0.0166 - val_acc: 0.9956
Epoch 51/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0162 - acc: 0.9950 - val_loss: 0.0166 - val_acc: 0.9956
Epoch 52/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0163 - acc: 0.9955 - val_loss: 0.0164 - val_acc: 0.9956
Epoch 53/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0167 - acc: 0.9948 - val_loss: 0.0164 - val_acc: 0.9958
Epoch 54/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0172 - acc: 0.9946 - val_loss: 0.0164 - val_acc: 0.9956
Epoch 55/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0174 - acc: 0.9944 - val_loss: 0.0164 - val_acc: 0.9958
Epoch 56/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0169 - acc: 0.9947 - val_loss: 0.0163 - val_acc: 0.9958
Epoch 57/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0169 - acc: 0.9946 - val_loss: 0.0163 - val_acc: 0.9958
Epoch 58/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0161 - acc: 0.9952 - val_loss: 0.0163 - val_acc: 0.9958
Epoch 59/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0169 - acc: 0.9950 - val_loss: 0.0163 - val_acc: 0.9958
Epoch 60/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0164 - acc: 0.9953 - val_loss: 0.0162 - val_acc: 0.9958
Epoch 61/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0163 - acc: 0.9955 - val_loss: 0.0161 - val_acc: 0.9958
Epoch 62/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0167 - acc: 0.9952 - val_loss: 0.0162 - val_acc: 0.9958
Epoch 63/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0157 - acc: 0.9954 - val_loss: 0.0162 - val_acc: 0.9958
Epoch 64/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0163 - acc: 0.9949 - val_loss: 0.0162 - val_acc: 0.9958
Epoch 65/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0160 - acc: 0.9953 - val_loss: 0.0161 - val_acc: 0.9958
Epoch 66/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0152 - acc: 0.9951 - val_loss: 0.0161 - val_acc: 0.9958
Epoch 67/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0161 - acc: 0.9950 - val_loss: 0.0162 - val_acc: 0.9958
Epoch 68/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0162 - acc: 0.9946 - val_loss: 0.0161 - val_acc: 0.9958
Epoch 69/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0153 - acc: 0.9955 - val_loss: 0.0161 - val_acc: 0.9958
Epoch 70/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0160 - acc: 0.9952 - val_loss: 0.0160 - val_acc: 0.9958
Epoch 71/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0156 - acc: 0.9951 - val_loss: 0.0160 - val_acc: 0.9958
Epoch 72/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0162 - acc: 0.9950 - val_loss: 0.0159 - val_acc: 0.9958
Epoch 73/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0152 - acc: 0.9952 - val_loss: 0.0159 - val_acc: 0.9958
Epoch 74/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9947 - val_loss: 0.0159 - val_acc: 0.9958
Epoch 75/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0157 - acc: 0.9952 - val_loss: 0.0158 - val_acc: 0.9958
Epoch 76/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0156 - acc: 0.9951 - val_loss: 0.0158 - val_acc: 0.9958
Epoch 77/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0157 - acc: 0.9945 - val_loss: 0.0158 - val_acc: 0.9958
Epoch 78/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0152 - acc: 0.9953 - val_loss: 0.0158 - val_acc: 0.9958
Epoch 79/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9954 - val_loss: 0.0158 - val_acc: 0.9958
Epoch 80/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0157 - acc: 0.9952 - val_loss: 0.0158 - val_acc: 0.9958
Epoch 81/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0150 - acc: 0.9952 - val_loss: 0.0157 - val_acc: 0.9958
Epoch 82/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0157 - acc: 0.9952 - val_loss: 0.0157 - val_acc: 0.9958
Epoch 83/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0158 - acc: 0.9953 - val_loss: 0.0157 - val_acc: 0.9958
Epoch 84/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0156 - acc: 0.9949 - val_loss: 0.0156 - val_acc: 0.9958
Epoch 85/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0146 - acc: 0.9956 - val_loss: 0.0157 - val_acc: 0.9958
Epoch 86/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0150 - acc: 0.9954 - val_loss: 0.0156 - val_acc: 0.9958
Epoch 87/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0158 - acc: 0.9949 - val_loss: 0.0157 - val_acc: 0.9958
Epoch 88/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0145 - acc: 0.9953 - val_loss: 0.0156 - val_acc: 0.9958
Epoch 89/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0149 - acc: 0.9954 - val_loss: 0.0156 - val_acc: 0.9958
Epoch 90/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0151 - acc: 0.9953 - val_loss: 0.0157 - val_acc: 0.9958
Epoch 91/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0154 - acc: 0.9949 - val_loss: 0.0157 - val_acc: 0.9958
Epoch 92/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0149 - acc: 0.9950 - val_loss: 0.0157 - val_acc: 0.9958
Epoch 93/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0148 - acc: 0.9952 - val_loss: 0.0157 - val_acc: 0.9958
Epoch 00093: early stopping
The model is not good enough

Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2 
with lr:0.001 ,mom:0.5 , decay:1e-05
[(2048,), (2048,), (2048,), (1536,)]
model created
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 57us/step - loss: 0.1839 - acc: 0.9469 - val_loss: 0.0702 - val_acc: 0.9880
Epoch 2/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0663 - acc: 0.9861 - val_loss: 0.0476 - val_acc: 0.9900
Epoch 3/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0495 - acc: 0.9890 - val_loss: 0.0382 - val_acc: 0.9922
Epoch 4/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0418 - acc: 0.9901 - val_loss: 0.0334 - val_acc: 0.9928
Epoch 5/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0371 - acc: 0.9904 - val_loss: 0.0301 - val_acc: 0.9930
Epoch 6/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0339 - acc: 0.9913 - val_loss: 0.0279 - val_acc: 0.9932
Epoch 7/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0315 - acc: 0.9918 - val_loss: 0.0261 - val_acc: 0.9938
Epoch 8/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0296 - acc: 0.9929 - val_loss: 0.0250 - val_acc: 0.9934
Epoch 9/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0298 - acc: 0.9917 - val_loss: 0.0238 - val_acc: 0.9936
Epoch 10/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0265 - acc: 0.9929 - val_loss: 0.0229 - val_acc: 0.9940
Epoch 11/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0248 - acc: 0.9931 - val_loss: 0.0223 - val_acc: 0.9938
Epoch 12/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0252 - acc: 0.9926 - val_loss: 0.0217 - val_acc: 0.9940
Epoch 13/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0236 - acc: 0.9939 - val_loss: 0.0212 - val_acc: 0.9940
Epoch 14/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0236 - acc: 0.9927 - val_loss: 0.0206 - val_acc: 0.9940
Epoch 15/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0222 - acc: 0.9936 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 16/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0224 - acc: 0.9935 - val_loss: 0.0198 - val_acc: 0.9942
Epoch 17/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0231 - acc: 0.9933 - val_loss: 0.0194 - val_acc: 0.9942
Epoch 18/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0214 - acc: 0.9938 - val_loss: 0.0195 - val_acc: 0.9938
Epoch 19/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0219 - acc: 0.9938 - val_loss: 0.0190 - val_acc: 0.9940
Epoch 20/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0200 - acc: 0.9943 - val_loss: 0.0191 - val_acc: 0.9940
Epoch 21/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0198 - acc: 0.9943 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 22/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0208 - acc: 0.9941 - val_loss: 0.0182 - val_acc: 0.9948
Epoch 23/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0208 - acc: 0.9938 - val_loss: 0.0183 - val_acc: 0.9940
Epoch 24/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0189 - acc: 0.9946 - val_loss: 0.0181 - val_acc: 0.9946
Epoch 25/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0199 - acc: 0.9940 - val_loss: 0.0179 - val_acc: 0.9946
Epoch 26/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0188 - acc: 0.9946 - val_loss: 0.0177 - val_acc: 0.9948
Epoch 27/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0185 - acc: 0.9940 - val_loss: 0.0175 - val_acc: 0.9948
Epoch 28/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0188 - acc: 0.9942 - val_loss: 0.0175 - val_acc: 0.9946
Epoch 29/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9944 - val_loss: 0.0174 - val_acc: 0.9944
Epoch 30/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0184 - acc: 0.9942 - val_loss: 0.0172 - val_acc: 0.9944
Epoch 31/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0180 - acc: 0.9945 - val_loss: 0.0170 - val_acc: 0.9950
Epoch 32/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0181 - acc: 0.9949 - val_loss: 0.0171 - val_acc: 0.9944
Epoch 33/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0172 - acc: 0.9946 - val_loss: 0.0169 - val_acc: 0.9944
Epoch 34/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0181 - acc: 0.9948 - val_loss: 0.0166 - val_acc: 0.9950
Epoch 35/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0167 - acc: 0.9955 - val_loss: 0.0168 - val_acc: 0.9944
Epoch 36/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0169 - acc: 0.9952 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 37/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0173 - acc: 0.9946 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 38/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0169 - acc: 0.9947 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 39/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0172 - acc: 0.9946 - val_loss: 0.0163 - val_acc: 0.9948
Epoch 40/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9953 - val_loss: 0.0164 - val_acc: 0.9946
Epoch 41/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0169 - acc: 0.9947 - val_loss: 0.0162 - val_acc: 0.9950
Epoch 42/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0172 - acc: 0.9946 - val_loss: 0.0161 - val_acc: 0.9950
Epoch 43/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0158 - acc: 0.9952 - val_loss: 0.0160 - val_acc: 0.9952
Epoch 44/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0168 - acc: 0.9944 - val_loss: 0.0161 - val_acc: 0.9950
Epoch 45/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0164 - acc: 0.9953 - val_loss: 0.0160 - val_acc: 0.9950
Epoch 46/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0144 - acc: 0.9959 - val_loss: 0.0159 - val_acc: 0.9950
Epoch 47/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0161 - acc: 0.9952 - val_loss: 0.0158 - val_acc: 0.9950
Epoch 48/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0160 - acc: 0.9955 - val_loss: 0.0158 - val_acc: 0.9950
Epoch 49/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0160 - acc: 0.9950 - val_loss: 0.0156 - val_acc: 0.9952
Epoch 50/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0159 - acc: 0.9951 - val_loss: 0.0158 - val_acc: 0.9952
Epoch 51/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0152 - acc: 0.9950 - val_loss: 0.0157 - val_acc: 0.9950
Epoch 52/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0150 - acc: 0.9956 - val_loss: 0.0157 - val_acc: 0.9950
Epoch 53/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0161 - acc: 0.9949 - val_loss: 0.0157 - val_acc: 0.9952
Epoch 54/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0153 - acc: 0.9950 - val_loss: 0.0155 - val_acc: 0.9952
Epoch 55/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9955 - val_loss: 0.0156 - val_acc: 0.9952
Epoch 56/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0156 - acc: 0.9954 - val_loss: 0.0154 - val_acc: 0.9952
Epoch 57/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0154 - acc: 0.9953 - val_loss: 0.0155 - val_acc: 0.9952
Epoch 58/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0148 - acc: 0.9959 - val_loss: 0.0154 - val_acc: 0.9954
Epoch 59/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0153 - acc: 0.9950 - val_loss: 0.0154 - val_acc: 0.9954
Epoch 60/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0145 - acc: 0.9955 - val_loss: 0.0155 - val_acc: 0.9952
Epoch 61/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0149 - acc: 0.9952 - val_loss: 0.0154 - val_acc: 0.9954
Epoch 62/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0154 - acc: 0.9947 - val_loss: 0.0153 - val_acc: 0.9952
Epoch 63/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0147 - acc: 0.9955 - val_loss: 0.0154 - val_acc: 0.9952
Epoch 64/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0144 - acc: 0.9952 - val_loss: 0.0153 - val_acc: 0.9954
Epoch 65/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0144 - acc: 0.9954 - val_loss: 0.0153 - val_acc: 0.9956
Epoch 66/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0141 - acc: 0.9954 - val_loss: 0.0153 - val_acc: 0.9954
Epoch 67/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0150 - acc: 0.9951 - val_loss: 0.0152 - val_acc: 0.9956
Epoch 68/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0139 - acc: 0.9958 - val_loss: 0.0151 - val_acc: 0.9952
Epoch 69/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0144 - acc: 0.9959 - val_loss: 0.0153 - val_acc: 0.9954
Epoch 70/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0135 - acc: 0.9959 - val_loss: 0.0154 - val_acc: 0.9952
Epoch 71/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0139 - acc: 0.9958 - val_loss: 0.0153 - val_acc: 0.9954
Epoch 72/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0138 - acc: 0.9955 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 73/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0154 - acc: 0.9953 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 74/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0143 - acc: 0.9956 - val_loss: 0.0150 - val_acc: 0.9952
Epoch 75/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0148 - acc: 0.9955 - val_loss: 0.0150 - val_acc: 0.9952
Epoch 76/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0145 - acc: 0.9956 - val_loss: 0.0149 - val_acc: 0.9954
Epoch 77/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0145 - acc: 0.9953 - val_loss: 0.0150 - val_acc: 0.9954
Epoch 78/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0133 - acc: 0.9959 - val_loss: 0.0149 - val_acc: 0.9952
Epoch 79/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0147 - acc: 0.9950 - val_loss: 0.0150 - val_acc: 0.9952
Epoch 80/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0136 - acc: 0.9963 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 81/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0143 - acc: 0.9954 - val_loss: 0.0149 - val_acc: 0.9954
Epoch 82/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0139 - acc: 0.9958 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 83/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0140 - acc: 0.9956 - val_loss: 0.0149 - val_acc: 0.9954
Epoch 00083: early stopping
The model is not good enough

Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2 
with lr:0.001 ,mom:0.5 , decay:1e-06
[(2048,), (2048,), (2048,), (1536,)]
model created
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 57us/step - loss: 0.1729 - acc: 0.9514 - val_loss: 0.0656 - val_acc: 0.9890
Epoch 2/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0629 - acc: 0.9868 - val_loss: 0.0445 - val_acc: 0.9924
Epoch 3/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0468 - acc: 0.9890 - val_loss: 0.0364 - val_acc: 0.9924
Epoch 4/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0411 - acc: 0.9900 - val_loss: 0.0319 - val_acc: 0.9924
Epoch 5/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0360 - acc: 0.9909 - val_loss: 0.0290 - val_acc: 0.9934
Epoch 6/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0335 - acc: 0.9909 - val_loss: 0.0269 - val_acc: 0.9934
Epoch 7/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0306 - acc: 0.9918 - val_loss: 0.0255 - val_acc: 0.9934
Epoch 8/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0282 - acc: 0.9919 - val_loss: 0.0240 - val_acc: 0.9940
Epoch 9/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0279 - acc: 0.9919 - val_loss: 0.0233 - val_acc: 0.9938
Epoch 10/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0261 - acc: 0.9931 - val_loss: 0.0225 - val_acc: 0.9938
Epoch 11/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0243 - acc: 0.9932 - val_loss: 0.0218 - val_acc: 0.9938
Epoch 12/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0243 - acc: 0.9931 - val_loss: 0.0210 - val_acc: 0.9942
Epoch 13/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0229 - acc: 0.9936 - val_loss: 0.0207 - val_acc: 0.9940
Epoch 14/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0233 - acc: 0.9934 - val_loss: 0.0202 - val_acc: 0.9944
Epoch 15/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0223 - acc: 0.9940 - val_loss: 0.0199 - val_acc: 0.9944
Epoch 16/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0207 - acc: 0.9938 - val_loss: 0.0195 - val_acc: 0.9944
Epoch 17/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0218 - acc: 0.9932 - val_loss: 0.0191 - val_acc: 0.9948
Epoch 18/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0206 - acc: 0.9936 - val_loss: 0.0189 - val_acc: 0.9946
Epoch 19/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0201 - acc: 0.9937 - val_loss: 0.0187 - val_acc: 0.9948
Epoch 20/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0207 - acc: 0.9936 - val_loss: 0.0184 - val_acc: 0.9948
Epoch 21/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0202 - acc: 0.9937 - val_loss: 0.0182 - val_acc: 0.9948
Epoch 22/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0193 - acc: 0.9944 - val_loss: 0.0181 - val_acc: 0.9948
Epoch 23/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0193 - acc: 0.9942 - val_loss: 0.0179 - val_acc: 0.9948
Epoch 24/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0196 - acc: 0.9940 - val_loss: 0.0177 - val_acc: 0.9948
Epoch 25/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0189 - acc: 0.9937 - val_loss: 0.0176 - val_acc: 0.9948
Epoch 26/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0182 - acc: 0.9945 - val_loss: 0.0175 - val_acc: 0.9948
Epoch 27/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0180 - acc: 0.9940 - val_loss: 0.0173 - val_acc: 0.9948
Epoch 28/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0176 - acc: 0.9949 - val_loss: 0.0173 - val_acc: 0.9948
Epoch 29/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0182 - acc: 0.9943 - val_loss: 0.0173 - val_acc: 0.9948
Epoch 30/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0175 - acc: 0.9942 - val_loss: 0.0171 - val_acc: 0.9948
Epoch 31/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0169 - acc: 0.9946 - val_loss: 0.0168 - val_acc: 0.9948
Epoch 32/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0182 - acc: 0.9946 - val_loss: 0.0170 - val_acc: 0.9948
Epoch 33/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0176 - acc: 0.9947 - val_loss: 0.0167 - val_acc: 0.9948
Epoch 34/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0166 - acc: 0.9954 - val_loss: 0.0166 - val_acc: 0.9948
Epoch 35/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0167 - acc: 0.9952 - val_loss: 0.0166 - val_acc: 0.9948
Epoch 36/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0166 - acc: 0.9945 - val_loss: 0.0166 - val_acc: 0.9948
Epoch 37/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0166 - acc: 0.9949 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 38/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0167 - acc: 0.9950 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 39/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0165 - acc: 0.9952 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 40/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0167 - acc: 0.9947 - val_loss: 0.0163 - val_acc: 0.9948
Epoch 41/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0168 - acc: 0.9945 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 42/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0161 - acc: 0.9949 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 43/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0160 - acc: 0.9950 - val_loss: 0.0161 - val_acc: 0.9948
Epoch 44/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0161 - acc: 0.9948 - val_loss: 0.0160 - val_acc: 0.9948
Epoch 45/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0159 - acc: 0.9947 - val_loss: 0.0160 - val_acc: 0.9948
Epoch 46/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0157 - acc: 0.9950 - val_loss: 0.0159 - val_acc: 0.9948
Epoch 47/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0164 - acc: 0.9946 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 48/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0157 - acc: 0.9954 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 49/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0160 - acc: 0.9945 - val_loss: 0.0159 - val_acc: 0.9948
Epoch 50/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0147 - acc: 0.9956 - val_loss: 0.0159 - val_acc: 0.9948
Epoch 51/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0151 - acc: 0.9951 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 52/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0153 - acc: 0.9951 - val_loss: 0.0159 - val_acc: 0.9950
Epoch 53/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0148 - acc: 0.9954 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 54/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0146 - acc: 0.9955 - val_loss: 0.0157 - val_acc: 0.9948
Epoch 55/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0146 - acc: 0.9954 - val_loss: 0.0156 - val_acc: 0.9948
Epoch 56/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0154 - acc: 0.9954 - val_loss: 0.0156 - val_acc: 0.9948
Epoch 57/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0150 - acc: 0.9953 - val_loss: 0.0156 - val_acc: 0.9948
Epoch 58/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0145 - acc: 0.9949 - val_loss: 0.0154 - val_acc: 0.9950
Epoch 59/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0150 - acc: 0.9953 - val_loss: 0.0154 - val_acc: 0.9950
Epoch 60/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0152 - val_acc: 0.9952
Epoch 61/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0141 - acc: 0.9953 - val_loss: 0.0153 - val_acc: 0.9950
Epoch 62/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0139 - acc: 0.9960 - val_loss: 0.0154 - val_acc: 0.9950
Epoch 63/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0141 - acc: 0.9958 - val_loss: 0.0151 - val_acc: 0.9952
Epoch 64/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0149 - acc: 0.9955 - val_loss: 0.0153 - val_acc: 0.9950
Epoch 65/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0142 - acc: 0.9956 - val_loss: 0.0154 - val_acc: 0.9952
Epoch 66/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0141 - acc: 0.9960 - val_loss: 0.0153 - val_acc: 0.9950
Epoch 67/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0143 - acc: 0.9957 - val_loss: 0.0154 - val_acc: 0.9950
Epoch 68/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0143 - acc: 0.9958 - val_loss: 0.0154 - val_acc: 0.9952
Epoch 69/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0135 - acc: 0.9958 - val_loss: 0.0152 - val_acc: 0.9950
Epoch 70/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0142 - acc: 0.9957 - val_loss: 0.0153 - val_acc: 0.9950
Epoch 00070: early stopping
The model is not good enough

Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2 
with lr:0.001 ,mom:0.9 , decay:0.0001
[(2048,), (2048,), (2048,), (1536,)]
model created
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 57us/step - loss: 0.0819 - acc: 0.9723 - val_loss: 0.0275 - val_acc: 0.9936
Epoch 2/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0290 - acc: 0.9918 - val_loss: 0.0219 - val_acc: 0.9944
Epoch 3/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0236 - acc: 0.9926 - val_loss: 0.0194 - val_acc: 0.9950
Epoch 4/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0212 - acc: 0.9939 - val_loss: 0.0184 - val_acc: 0.9952
Epoch 5/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0200 - acc: 0.9938 - val_loss: 0.0173 - val_acc: 0.9958
Epoch 6/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0188 - acc: 0.9942 - val_loss: 0.0171 - val_acc: 0.9952
Epoch 7/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0178 - acc: 0.9946 - val_loss: 0.0164 - val_acc: 0.9958
Epoch 8/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0168 - acc: 0.9946 - val_loss: 0.0162 - val_acc: 0.9956
Epoch 9/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9954 - val_loss: 0.0159 - val_acc: 0.9958
Epoch 10/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0156 - acc: 0.9956 - val_loss: 0.0158 - val_acc: 0.9954
Epoch 11/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0161 - acc: 0.9953 - val_loss: 0.0155 - val_acc: 0.9956
Epoch 12/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0156 - acc: 0.9952 - val_loss: 0.0155 - val_acc: 0.9956
Epoch 13/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0150 - acc: 0.9951 - val_loss: 0.0150 - val_acc: 0.9964
Epoch 14/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0139 - acc: 0.9951 - val_loss: 0.0149 - val_acc: 0.9962
Epoch 15/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0139 - acc: 0.9960 - val_loss: 0.0148 - val_acc: 0.9960
Epoch 16/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0141 - acc: 0.9958 - val_loss: 0.0151 - val_acc: 0.9956
Epoch 17/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0146 - acc: 0.9958 - val_loss: 0.0148 - val_acc: 0.9960
Epoch 18/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0137 - acc: 0.9961 - val_loss: 0.0147 - val_acc: 0.9960
Epoch 19/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0135 - acc: 0.9960 - val_loss: 0.0151 - val_acc: 0.9952
Epoch 20/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0132 - acc: 0.9960 - val_loss: 0.0153 - val_acc: 0.9952
Epoch 21/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0126 - acc: 0.9957 - val_loss: 0.0144 - val_acc: 0.9956
Epoch 22/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0130 - acc: 0.9958 - val_loss: 0.0145 - val_acc: 0.9958
Epoch 23/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0126 - acc: 0.9963 - val_loss: 0.0145 - val_acc: 0.9958
Epoch 24/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0123 - acc: 0.9964 - val_loss: 0.0145 - val_acc: 0.9958
Epoch 25/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0130 - acc: 0.9958 - val_loss: 0.0144 - val_acc: 0.9956
Epoch 26/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0122 - acc: 0.9967 - val_loss: 0.0141 - val_acc: 0.9954
Epoch 27/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0117 - acc: 0.9963 - val_loss: 0.0144 - val_acc: 0.9954
Epoch 28/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0114 - acc: 0.9965 - val_loss: 0.0140 - val_acc: 0.9954
Epoch 29/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0126 - acc: 0.9961 - val_loss: 0.0139 - val_acc: 0.9952
Epoch 30/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0116 - acc: 0.9967 - val_loss: 0.0140 - val_acc: 0.9956
Epoch 31/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0117 - acc: 0.9965 - val_loss: 0.0141 - val_acc: 0.9958
Epoch 32/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0115 - acc: 0.9965 - val_loss: 0.0143 - val_acc: 0.9954
Epoch 33/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0114 - acc: 0.9965 - val_loss: 0.0141 - val_acc: 0.9958
Epoch 34/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0111 - acc: 0.9970 - val_loss: 0.0143 - val_acc: 0.9956
Epoch 35/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0111 - acc: 0.9972 - val_loss: 0.0139 - val_acc: 0.9956
Epoch 36/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0122 - acc: 0.9963 - val_loss: 0.0141 - val_acc: 0.9956
Epoch 00036: early stopping
The model is better than ever!
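The shape list printed at the start of each run, `[(2048,), (2048,), (2048,), (1536,)]`, is the per-sample width of the bottleneck features from the four base networks (InceptionV3, Xception and ResNet50 each emit 2048-d vectors; InceptionResNetV2 emits 1536-d). A minimal NumPy sketch of how the merged model's input is formed by concatenating them (random arrays stand in for the real extracted features):

```python
import numpy as np

# Stand-in bottleneck features for a batch of 5 samples:
# InceptionV3 / Xception / ResNet50 give 2048-d vectors,
# InceptionResNetV2 gives 1536-d.
feature_widths = (2048, 2048, 2048, 1536)
features = [np.random.rand(5, w) for w in feature_widths]

# The merged model concatenates them along the feature axis,
# so every sample becomes one 2048*3 + 1536 = 7680-d vector.
merged = np.concatenate(features, axis=1)
print(merged.shape)  # (5, 7680)
```

In the actual notebook this concatenation happens inside the Keras model (the four feature tensors feeding a shared classifier head); the array version above only illustrates the resulting input width.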

Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2 
with lr:0.001 ,mom:0.9 , decay:1e-05
[(2048,), (2048,), (2048,), (1536,)]
model created
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 59us/step - loss: 0.0840 - acc: 0.9703 - val_loss: 0.0288 - val_acc: 0.9914
Epoch 2/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0296 - acc: 0.9913 - val_loss: 0.0228 - val_acc: 0.9932
Epoch 3/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0239 - acc: 0.9927 - val_loss: 0.0205 - val_acc: 0.9932
Epoch 4/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0219 - acc: 0.9936 - val_loss: 0.0187 - val_acc: 0.9954
Epoch 5/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0200 - acc: 0.9940 - val_loss: 0.0182 - val_acc: 0.9946
Epoch 6/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0185 - acc: 0.9944 - val_loss: 0.0176 - val_acc: 0.9948
Epoch 7/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0182 - acc: 0.9940 - val_loss: 0.0172 - val_acc: 0.9944
Epoch 8/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0170 - acc: 0.9944 - val_loss: 0.0168 - val_acc: 0.9946
Epoch 9/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0158 - acc: 0.9954 - val_loss: 0.0166 - val_acc: 0.9946
Epoch 10/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0164 - acc: 0.9950 - val_loss: 0.0164 - val_acc: 0.9946
Epoch 11/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0155 - acc: 0.9951 - val_loss: 0.0160 - val_acc: 0.9948
Epoch 12/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0152 - acc: 0.9953 - val_loss: 0.0156 - val_acc: 0.9956
Epoch 13/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0148 - acc: 0.9952 - val_loss: 0.0159 - val_acc: 0.9950
Epoch 14/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0145 - acc: 0.9951 - val_loss: 0.0161 - val_acc: 0.9948
Epoch 15/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0142 - acc: 0.9957 - val_loss: 0.0156 - val_acc: 0.9950
Epoch 16/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0142 - acc: 0.9949 - val_loss: 0.0152 - val_acc: 0.9954
Epoch 17/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0127 - acc: 0.9960 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 18/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0137 - acc: 0.9954 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 19/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0127 - acc: 0.9958 - val_loss: 0.0148 - val_acc: 0.9958
Epoch 20/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0124 - acc: 0.9958 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 21/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0125 - acc: 0.9961 - val_loss: 0.0148 - val_acc: 0.9954
Epoch 22/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0125 - acc: 0.9959 - val_loss: 0.0148 - val_acc: 0.9954
Epoch 23/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0120 - acc: 0.9961 - val_loss: 0.0150 - val_acc: 0.9954
Epoch 24/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0121 - acc: 0.9964 - val_loss: 0.0146 - val_acc: 0.9954
Epoch 25/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0116 - acc: 0.9963 - val_loss: 0.0150 - val_acc: 0.9958
Epoch 26/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0112 - acc: 0.9971 - val_loss: 0.0147 - val_acc: 0.9954
Epoch 27/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0123 - acc: 0.9959 - val_loss: 0.0147 - val_acc: 0.9954
Epoch 28/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0114 - acc: 0.9968 - val_loss: 0.0149 - val_acc: 0.9956
Epoch 29/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0116 - acc: 0.9959 - val_loss: 0.0145 - val_acc: 0.9954
Epoch 30/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0112 - acc: 0.9966 - val_loss: 0.0149 - val_acc: 0.9958
Epoch 31/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0106 - acc: 0.9970 - val_loss: 0.0149 - val_acc: 0.9958
Epoch 32/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0106 - acc: 0.9967 - val_loss: 0.0148 - val_acc: 0.9958
Epoch 33/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0110 - acc: 0.9964 - val_loss: 0.0146 - val_acc: 0.9956
Epoch 34/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0104 - acc: 0.9967 - val_loss: 0.0144 - val_acc: 0.9956
Epoch 35/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0116 - acc: 0.9963 - val_loss: 0.0147 - val_acc: 0.9958
Epoch 36/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0104 - acc: 0.9969 - val_loss: 0.0142 - val_acc: 0.9950
Epoch 37/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0104 - acc: 0.9969 - val_loss: 0.0144 - val_acc: 0.9954
Epoch 38/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0102 - acc: 0.9972 - val_loss: 0.0146 - val_acc: 0.9958
Epoch 39/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0099 - acc: 0.9969 - val_loss: 0.0144 - val_acc: 0.9956
Epoch 40/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0102 - acc: 0.9970 - val_loss: 0.0145 - val_acc: 0.9958
Epoch 41/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0099 - acc: 0.9969 - val_loss: 0.0142 - val_acc: 0.9950
Epoch 42/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0102 - acc: 0.9967 - val_loss: 0.0144 - val_acc: 0.9960
Epoch 43/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0096 - acc: 0.9970 - val_loss: 0.0142 - val_acc: 0.9952
Epoch 44/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0100 - acc: 0.9970 - val_loss: 0.0146 - val_acc: 0.9958
Epoch 45/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0107 - acc: 0.9968 - val_loss: 0.0146 - val_acc: 0.9960
Epoch 46/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0099 - acc: 0.9970 - val_loss: 0.0143 - val_acc: 0.9954
Epoch 47/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0094 - acc: 0.9970 - val_loss: 0.0144 - val_acc: 0.9960
Epoch 48/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0094 - acc: 0.9973 - val_loss: 0.0141 - val_acc: 0.9952
Epoch 49/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0090 - acc: 0.9973 - val_loss: 0.0145 - val_acc: 0.9960
Epoch 50/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0093 - acc: 0.9972 - val_loss: 0.0143 - val_acc: 0.9956
Epoch 51/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0098 - acc: 0.9970 - val_loss: 0.0145 - val_acc: 0.9958
Epoch 52/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0092 - acc: 0.9974 - val_loss: 0.0143 - val_acc: 0.9956
Epoch 53/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0095 - acc: 0.9968 - val_loss: 0.0142 - val_acc: 0.9952
Epoch 54/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0097 - acc: 0.9970 - val_loss: 0.0142 - val_acc: 0.9954
Epoch 55/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0094 - acc: 0.9969 - val_loss: 0.0143 - val_acc: 0.9954
Epoch 00055: early stopping
The model is not good enough

Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2 
with lr:0.001 ,mom:0.9 , decay:1e-06
[(2048,), (2048,), (2048,), (1536,)]
model created
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 58us/step - loss: 0.0866 - acc: 0.9669 - val_loss: 0.0273 - val_acc: 0.9936
Epoch 2/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0294 - acc: 0.9914 - val_loss: 0.0219 - val_acc: 0.9952
Epoch 3/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0233 - acc: 0.9931 - val_loss: 0.0197 - val_acc: 0.9950
Epoch 4/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0219 - acc: 0.9933 - val_loss: 0.0185 - val_acc: 0.9952
Epoch 5/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0192 - acc: 0.9943 - val_loss: 0.0178 - val_acc: 0.9948
Epoch 6/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0187 - acc: 0.9943 - val_loss: 0.0172 - val_acc: 0.9954
Epoch 7/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0175 - acc: 0.9942 - val_loss: 0.0169 - val_acc: 0.9952
Epoch 8/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0165 - acc: 0.9949 - val_loss: 0.0167 - val_acc: 0.9950
Epoch 9/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0162 - acc: 0.9950 - val_loss: 0.0162 - val_acc: 0.9954
Epoch 10/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0162 - acc: 0.9949 - val_loss: 0.0172 - val_acc: 0.9942
Epoch 11/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0152 - acc: 0.9952 - val_loss: 0.0158 - val_acc: 0.9954
Epoch 12/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0148 - acc: 0.9953 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 13/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0139 - acc: 0.9962 - val_loss: 0.0155 - val_acc: 0.9954
Epoch 14/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0136 - acc: 0.9956 - val_loss: 0.0155 - val_acc: 0.9954
Epoch 15/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0134 - acc: 0.9959 - val_loss: 0.0151 - val_acc: 0.9960
Epoch 16/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0131 - acc: 0.9958 - val_loss: 0.0154 - val_acc: 0.9952
Epoch 17/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0134 - acc: 0.9961 - val_loss: 0.0151 - val_acc: 0.9960
Epoch 18/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0130 - acc: 0.9960 - val_loss: 0.0164 - val_acc: 0.9946
Epoch 19/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0130 - acc: 0.9959 - val_loss: 0.0149 - val_acc: 0.9964
Epoch 20/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0128 - acc: 0.9961 - val_loss: 0.0151 - val_acc: 0.9952
Epoch 21/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0131 - acc: 0.9959 - val_loss: 0.0150 - val_acc: 0.9954
Epoch 22/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0125 - acc: 0.9960 - val_loss: 0.0146 - val_acc: 0.9960
Epoch 23/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0127 - acc: 0.9963 - val_loss: 0.0146 - val_acc: 0.9960
Epoch 24/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0127 - acc: 0.9963 - val_loss: 0.0147 - val_acc: 0.9950
Epoch 25/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0123 - acc: 0.9963 - val_loss: 0.0151 - val_acc: 0.9954
Epoch 26/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0119 - acc: 0.9962 - val_loss: 0.0146 - val_acc: 0.9954
Epoch 27/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0123 - acc: 0.9964 - val_loss: 0.0145 - val_acc: 0.9954
Epoch 28/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0113 - acc: 0.9968 - val_loss: 0.0144 - val_acc: 0.9952
Epoch 29/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0112 - acc: 0.9969 - val_loss: 0.0145 - val_acc: 0.9952
Epoch 30/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0114 - acc: 0.9966 - val_loss: 0.0146 - val_acc: 0.9952
Epoch 31/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0110 - acc: 0.9965 - val_loss: 0.0146 - val_acc: 0.9950
Epoch 32/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0109 - acc: 0.9970 - val_loss: 0.0146 - val_acc: 0.9950
Epoch 33/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0108 - acc: 0.9963 - val_loss: 0.0145 - val_acc: 0.9952
Epoch 34/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0108 - acc: 0.9971 - val_loss: 0.0142 - val_acc: 0.9958
Epoch 35/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0106 - acc: 0.9969 - val_loss: 0.0146 - val_acc: 0.9950
Epoch 36/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0107 - acc: 0.9967 - val_loss: 0.0143 - val_acc: 0.9952
Epoch 37/500
20000/20000 [==============================] - 1s 52us/step - loss: 0.0105 - acc: 0.9968 - val_loss: 0.0144 - val_acc: 0.9952
Epoch 38/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0104 - acc: 0.9970 - val_loss: 0.0146 - val_acc: 0.9952
Epoch 39/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0095 - acc: 0.9972 - val_loss: 0.0145 - val_acc: 0.9952
Epoch 40/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0101 - acc: 0.9969 - val_loss: 0.0145 - val_acc: 0.9952
Epoch 41/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0100 - acc: 0.9972 - val_loss: 0.0145 - val_acc: 0.9954
Epoch 00041: early stopping
The model is not good enough
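The alternating "The model is better than ever!" / "The model is not good enough" messages indicate a grid search over the SGD hyperparameters (`lr`, `mom`, `decay`) that keeps only the best validation loss seen so far. A minimal sketch of that selection loop, with hypothetical names (`grid_search` and `fit_fn` are illustrative stand-ins, not the actual functions in `helper.py`; `fit_fn` replaces the real training call, which runs with EarlyStopping and returns the run's best validation loss):

```python
import itertools

def grid_search(fit_fn, lrs, moms, decays):
    """Try every (lr, momentum, decay) combination and keep the best one.

    fit_fn(lr, mom, decay) is assumed to train the merged model and
    return its best validation loss.
    """
    best_loss, best_params = float("inf"), None
    for lr, mom, decay in itertools.product(lrs, moms, decays):
        print("Start Fitting Merged Model with lr:%s ,mom:%s , decay:%s" % (lr, mom, decay))
        val_loss = fit_fn(lr, mom, decay)
        if val_loss < best_loss:
            best_loss, best_params = val_loss, (lr, mom, decay)
            print("The model is better than ever!")
        else:
            print("The model is not good enough")
    return best_params, best_loss

# Demo with a toy objective instead of real training: the minimum sits
# at lr=1e-3, decay=1e-5, matching the first logged run above.
fake = lambda lr, mom, decay: abs(lr - 1e-3) + abs(decay - 1e-5)
params, loss = grid_search(fake, [1e-3, 1e-4], [0.9, 0.2], [1e-5, 1e-6, 1e-4])
```

Only a strict improvement in validation loss updates the incumbent, which is why later runs with comparable `val_loss` are still reported as "not good enough".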

Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2 
with lr:0.0001 ,mom:0.2 , decay:0.0001
[(2048,), (2048,), (2048,), (1536,)]
model created
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 60us/step - loss: 0.6236 - acc: 0.6549 - val_loss: 0.4244 - val_acc: 0.8930
Epoch 2/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.3899 - acc: 0.8614 - val_loss: 0.2866 - val_acc: 0.9692
Epoch 3/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.2865 - acc: 0.9267 - val_loss: 0.2184 - val_acc: 0.9794
Epoch 4/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.2298 - acc: 0.9506 - val_loss: 0.1788 - val_acc: 0.9828
Epoch 5/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.1952 - acc: 0.9617 - val_loss: 0.1531 - val_acc: 0.9848
Epoch 6/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.1693 - acc: 0.9672 - val_loss: 0.1353 - val_acc: 0.9860
Epoch 7/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.1534 - acc: 0.9718 - val_loss: 0.1219 - val_acc: 0.9864
Epoch 8/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.1384 - acc: 0.9756 - val_loss: 0.1116 - val_acc: 0.9870
Epoch 9/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.1271 - acc: 0.9773 - val_loss: 0.1035 - val_acc: 0.9868
Epoch 10/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.1195 - acc: 0.9781 - val_loss: 0.0969 - val_acc: 0.9874
Epoch 11/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.1119 - acc: 0.9796 - val_loss: 0.0914 - val_acc: 0.9874
Epoch 12/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.1042 - acc: 0.9815 - val_loss: 0.0868 - val_acc: 0.9878
Epoch 13/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0997 - acc: 0.9819 - val_loss: 0.0829 - val_acc: 0.9878
Epoch 14/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0970 - acc: 0.9819 - val_loss: 0.0794 - val_acc: 0.9880
Epoch 15/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0936 - acc: 0.9819 - val_loss: 0.0763 - val_acc: 0.9886
Epoch 16/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0886 - acc: 0.9840 - val_loss: 0.0736 - val_acc: 0.9890
Epoch 17/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0859 - acc: 0.9839 - val_loss: 0.0712 - val_acc: 0.9890
Epoch 18/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0839 - acc: 0.9847 - val_loss: 0.0691 - val_acc: 0.9892
Epoch 19/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0805 - acc: 0.9846 - val_loss: 0.0671 - val_acc: 0.9894
Epoch 20/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0779 - acc: 0.9856 - val_loss: 0.0653 - val_acc: 0.9894
Epoch 21/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0768 - acc: 0.9847 - val_loss: 0.0637 - val_acc: 0.9894
Epoch 22/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0742 - acc: 0.9861 - val_loss: 0.0622 - val_acc: 0.9898
Epoch 23/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0726 - acc: 0.9850 - val_loss: 0.0609 - val_acc: 0.9900
Epoch 24/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0715 - acc: 0.9849 - val_loss: 0.0596 - val_acc: 0.9904
Epoch 25/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0702 - acc: 0.9855 - val_loss: 0.0584 - val_acc: 0.9904
Epoch 26/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0700 - acc: 0.9857 - val_loss: 0.0573 - val_acc: 0.9904
Epoch 27/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0663 - acc: 0.9871 - val_loss: 0.0563 - val_acc: 0.9902
Epoch 28/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0666 - acc: 0.9859 - val_loss: 0.0554 - val_acc: 0.9902
Epoch 29/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0652 - acc: 0.9864 - val_loss: 0.0544 - val_acc: 0.9902
Epoch 30/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0628 - acc: 0.9870 - val_loss: 0.0536 - val_acc: 0.9904
Epoch 31/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0631 - acc: 0.9872 - val_loss: 0.0528 - val_acc: 0.9906
Epoch 32/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0619 - acc: 0.9865 - val_loss: 0.0520 - val_acc: 0.9906
Epoch 33/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0603 - acc: 0.9883 - val_loss: 0.0513 - val_acc: 0.9906
Epoch 34/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0596 - acc: 0.9876 - val_loss: 0.0507 - val_acc: 0.9906
Epoch 35/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0592 - acc: 0.9877 - val_loss: 0.0500 - val_acc: 0.9906
Epoch 36/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0581 - acc: 0.9868 - val_loss: 0.0494 - val_acc: 0.9906
Epoch 37/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0587 - acc: 0.9879 - val_loss: 0.0489 - val_acc: 0.9908
Epoch 38/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0572 - acc: 0.9886 - val_loss: 0.0483 - val_acc: 0.9906
Epoch 39/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0570 - acc: 0.9879 - val_loss: 0.0478 - val_acc: 0.9908
Epoch 40/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0556 - acc: 0.9876 - val_loss: 0.0473 - val_acc: 0.9908
Epoch 41/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0552 - acc: 0.9879 - val_loss: 0.0468 - val_acc: 0.9908
Epoch 42/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0551 - acc: 0.9882 - val_loss: 0.0463 - val_acc: 0.9910
Epoch 43/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0545 - acc: 0.9879 - val_loss: 0.0459 - val_acc: 0.9910
Epoch 44/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0534 - acc: 0.9884 - val_loss: 0.0455 - val_acc: 0.9910
Epoch 45/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0529 - acc: 0.9886 - val_loss: 0.0451 - val_acc: 0.9910
Epoch 46/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0516 - acc: 0.9894 - val_loss: 0.0447 - val_acc: 0.9910
Epoch 47/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0533 - acc: 0.9882 - val_loss: 0.0443 - val_acc: 0.9910
Epoch 48/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0526 - acc: 0.9888 - val_loss: 0.0439 - val_acc: 0.9914
Epoch 49/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0519 - acc: 0.9878 - val_loss: 0.0436 - val_acc: 0.9916
Epoch 50/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0506 - acc: 0.9889 - val_loss: 0.0432 - val_acc: 0.9916
Epoch 51/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0502 - acc: 0.9893 - val_loss: 0.0429 - val_acc: 0.9916
Epoch 52/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0492 - acc: 0.9896 - val_loss: 0.0426 - val_acc: 0.9916
Epoch 53/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0494 - acc: 0.9894 - val_loss: 0.0423 - val_acc: 0.9916
Epoch 54/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0487 - acc: 0.9895 - val_loss: 0.0420 - val_acc: 0.9916
Epoch 55/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0493 - acc: 0.9886 - val_loss: 0.0417 - val_acc: 0.9916
Epoch 56/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0482 - acc: 0.9894 - val_loss: 0.0415 - val_acc: 0.9916
Epoch 57/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0493 - acc: 0.9884 - val_loss: 0.0412 - val_acc: 0.9916
Epoch 58/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0484 - acc: 0.9890 - val_loss: 0.0409 - val_acc: 0.9916
Epoch 59/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0476 - acc: 0.9893 - val_loss: 0.0407 - val_acc: 0.9916
Epoch 60/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0478 - acc: 0.9887 - val_loss: 0.0404 - val_acc: 0.9918
Epoch 61/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0469 - acc: 0.9897 - val_loss: 0.0402 - val_acc: 0.9916
Epoch 62/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0479 - acc: 0.9887 - val_loss: 0.0400 - val_acc: 0.9918
Epoch 63/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0459 - acc: 0.9899 - val_loss: 0.0397 - val_acc: 0.9918
Epoch 64/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0462 - acc: 0.9896 - val_loss: 0.0395 - val_acc: 0.9918
Epoch 65/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0469 - acc: 0.9890 - val_loss: 0.0393 - val_acc: 0.9918
Epoch 66/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0454 - acc: 0.9899 - val_loss: 0.0391 - val_acc: 0.9918
Epoch 67/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0454 - acc: 0.9894 - val_loss: 0.0389 - val_acc: 0.9918
Epoch 68/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0454 - acc: 0.9895 - val_loss: 0.0387 - val_acc: 0.9918
Epoch 69/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0455 - acc: 0.9897 - val_loss: 0.0385 - val_acc: 0.9918
Epoch 70/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0443 - acc: 0.9901 - val_loss: 0.0383 - val_acc: 0.9918
Epoch 71/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0438 - acc: 0.9897 - val_loss: 0.0382 - val_acc: 0.9918
Epoch 72/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0445 - acc: 0.9902 - val_loss: 0.0380 - val_acc: 0.9918
Epoch 73/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0438 - acc: 0.9901 - val_loss: 0.0378 - val_acc: 0.9918
Epoch 74/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0430 - acc: 0.9903 - val_loss: 0.0376 - val_acc: 0.9922
Epoch 75/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0444 - acc: 0.9896 - val_loss: 0.0375 - val_acc: 0.9922
Epoch 76/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0435 - acc: 0.9900 - val_loss: 0.0373 - val_acc: 0.9922
Epoch 77/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0435 - acc: 0.9896 - val_loss: 0.0372 - val_acc: 0.9920
Epoch 78/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0432 - acc: 0.9891 - val_loss: 0.0370 - val_acc: 0.9922
Epoch 79/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0429 - acc: 0.9903 - val_loss: 0.0369 - val_acc: 0.9922
Epoch 80/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0422 - acc: 0.9906 - val_loss: 0.0367 - val_acc: 0.9922
Epoch 81/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0415 - acc: 0.9902 - val_loss: 0.0366 - val_acc: 0.9922
Epoch 82/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0424 - acc: 0.9901 - val_loss: 0.0364 - val_acc: 0.9922
Epoch 83/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0426 - acc: 0.9895 - val_loss: 0.0363 - val_acc: 0.9922
Epoch 84/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0410 - acc: 0.9901 - val_loss: 0.0362 - val_acc: 0.9922
Epoch 85/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0416 - acc: 0.9905 - val_loss: 0.0360 - val_acc: 0.9924
Epoch 86/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0414 - acc: 0.9899 - val_loss: 0.0359 - val_acc: 0.9924
Epoch 87/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0416 - acc: 0.9911 - val_loss: 0.0358 - val_acc: 0.9924
Epoch 88/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0407 - acc: 0.9907 - val_loss: 0.0356 - val_acc: 0.9926
Epoch 89/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0420 - acc: 0.9894 - val_loss: 0.0355 - val_acc: 0.9926
Epoch 90/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0419 - acc: 0.9899 - val_loss: 0.0354 - val_acc: 0.9928
Epoch 91/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0395 - acc: 0.9908 - val_loss: 0.0353 - val_acc: 0.9928
Epoch 92/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0405 - acc: 0.9909 - val_loss: 0.0352 - val_acc: 0.9928
Epoch 93/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0396 - acc: 0.9920 - val_loss: 0.0351 - val_acc: 0.9928
Epoch 94/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0401 - acc: 0.9909 - val_loss: 0.0349 - val_acc: 0.9928
Epoch 95/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0405 - acc: 0.9901 - val_loss: 0.0348 - val_acc: 0.9928
Epoch 96/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0403 - acc: 0.9904 - val_loss: 0.0347 - val_acc: 0.9930
Epoch 97/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0395 - acc: 0.9905 - val_loss: 0.0346 - val_acc: 0.9930
Epoch 98/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0399 - acc: 0.9911 - val_loss: 0.0345 - val_acc: 0.9930
Epoch 99/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0400 - acc: 0.9902 - val_loss: 0.0344 - val_acc: 0.9930
Epoch 100/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0399 - acc: 0.9905 - val_loss: 0.0343 - val_acc: 0.9930
Epoch 101/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0401 - acc: 0.9899 - val_loss: 0.0342 - val_acc: 0.9930
Epoch 102/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0399 - acc: 0.9906 - val_loss: 0.0341 - val_acc: 0.9930
Epoch 103/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0398 - acc: 0.9901 - val_loss: 0.0340 - val_acc: 0.9930
Epoch 104/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0388 - acc: 0.9906 - val_loss: 0.0339 - val_acc: 0.9930
Epoch 105/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0388 - acc: 0.9916 - val_loss: 0.0338 - val_acc: 0.9930
Epoch 106/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0388 - acc: 0.9908 - val_loss: 0.0337 - val_acc: 0.9930
Epoch 107/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0380 - acc: 0.9909 - val_loss: 0.0337 - val_acc: 0.9930
Epoch 108/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0387 - acc: 0.9909 - val_loss: 0.0336 - val_acc: 0.9930
Epoch 109/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0374 - acc: 0.9916 - val_loss: 0.0335 - val_acc: 0.9930
Epoch 110/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0381 - acc: 0.9913 - val_loss: 0.0334 - val_acc: 0.9930
Epoch 111/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0386 - acc: 0.9909 - val_loss: 0.0333 - val_acc: 0.9930
Epoch 112/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0375 - acc: 0.9909 - val_loss: 0.0332 - val_acc: 0.9930
Epoch 113/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0380 - acc: 0.9904 - val_loss: 0.0332 - val_acc: 0.9930
Epoch 114/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0379 - acc: 0.9914 - val_loss: 0.0331 - val_acc: 0.9930
Epoch 115/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0383 - acc: 0.9905 - val_loss: 0.0330 - val_acc: 0.9930
Epoch 116/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0378 - acc: 0.9913 - val_loss: 0.0329 - val_acc: 0.9930
Epoch 117/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0383 - acc: 0.9906 - val_loss: 0.0328 - val_acc: 0.9930
Epoch 118/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0373 - acc: 0.9917 - val_loss: 0.0328 - val_acc: 0.9930
Epoch 119/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0382 - acc: 0.9909 - val_loss: 0.0327 - val_acc: 0.9930
Epoch 120/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0372 - acc: 0.9909 - val_loss: 0.0326 - val_acc: 0.9930
Epoch 121/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0372 - acc: 0.9906 - val_loss: 0.0325 - val_acc: 0.9930
Epoch 122/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0374 - acc: 0.9906 - val_loss: 0.0325 - val_acc: 0.9930
Epoch 123/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0371 - acc: 0.9913 - val_loss: 0.0324 - val_acc: 0.9930
Epoch 124/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0371 - acc: 0.9912 - val_loss: 0.0323 - val_acc: 0.9930
Epoch 125/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0372 - acc: 0.9902 - val_loss: 0.0323 - val_acc: 0.9930
Epoch 126/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0366 - acc: 0.9906 - val_loss: 0.0322 - val_acc: 0.9930
Epoch 127/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0365 - acc: 0.9914 - val_loss: 0.0321 - val_acc: 0.9930
Epoch 128/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0369 - acc: 0.9910 - val_loss: 0.0321 - val_acc: 0.9930
Epoch 129/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0372 - acc: 0.9911 - val_loss: 0.0320 - val_acc: 0.9930
Epoch 130/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0367 - acc: 0.9913 - val_loss: 0.0319 - val_acc: 0.9930
Epoch 131/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0363 - acc: 0.9919 - val_loss: 0.0319 - val_acc: 0.9930
Epoch 132/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0365 - acc: 0.9915 - val_loss: 0.0318 - val_acc: 0.9930
Epoch 133/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0375 - acc: 0.9899 - val_loss: 0.0317 - val_acc: 0.9930
Epoch 134/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0365 - acc: 0.9906 - val_loss: 0.0317 - val_acc: 0.9930
Epoch 135/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0365 - acc: 0.9915 - val_loss: 0.0316 - val_acc: 0.9930
Epoch 136/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0361 - acc: 0.9911 - val_loss: 0.0316 - val_acc: 0.9930
Epoch 137/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0365 - acc: 0.9918 - val_loss: 0.0315 - val_acc: 0.9930
Epoch 138/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0366 - acc: 0.9906 - val_loss: 0.0315 - val_acc: 0.9930
Epoch 139/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0359 - acc: 0.9913 - val_loss: 0.0314 - val_acc: 0.9930
Epoch 140/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0355 - acc: 0.9913 - val_loss: 0.0313 - val_acc: 0.9930
Epoch 141/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0357 - acc: 0.9918 - val_loss: 0.0313 - val_acc: 0.9930
Epoch 142/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0357 - acc: 0.9906 - val_loss: 0.0312 - val_acc: 0.9930
Epoch 143/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0355 - acc: 0.9917 - val_loss: 0.0312 - val_acc: 0.9930
Epoch 144/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0359 - acc: 0.9902 - val_loss: 0.0311 - val_acc: 0.9930
Epoch 145/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0355 - acc: 0.9914 - val_loss: 0.0311 - val_acc: 0.9930
Epoch 146/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0353 - acc: 0.9914 - val_loss: 0.0310 - val_acc: 0.9930
Epoch 147/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0356 - acc: 0.9919 - val_loss: 0.0310 - val_acc: 0.9930
Epoch 148/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0348 - acc: 0.9917 - val_loss: 0.0309 - val_acc: 0.9930
Epoch 149/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0358 - acc: 0.9907 - val_loss: 0.0309 - val_acc: 0.9930
Epoch 150/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0352 - acc: 0.9911 - val_loss: 0.0308 - val_acc: 0.9930
Epoch 151/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0351 - acc: 0.9913 - val_loss: 0.0308 - val_acc: 0.9930
Epoch 152/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0357 - acc: 0.9909 - val_loss: 0.0307 - val_acc: 0.9930
Epoch 153/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0350 - acc: 0.9914 - val_loss: 0.0307 - val_acc: 0.9930
Epoch 154/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0345 - acc: 0.9915 - val_loss: 0.0306 - val_acc: 0.9930
Epoch 155/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0343 - acc: 0.9918 - val_loss: 0.0306 - val_acc: 0.9930
Epoch 156/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0355 - acc: 0.9906 - val_loss: 0.0305 - val_acc: 0.9930
Epoch 157/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0351 - acc: 0.9905 - val_loss: 0.0305 - val_acc: 0.9930
Epoch 158/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0337 - acc: 0.9924 - val_loss: 0.0305 - val_acc: 0.9930
Epoch 159/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0352 - acc: 0.9917 - val_loss: 0.0304 - val_acc: 0.9930
Epoch 160/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0338 - acc: 0.9920 - val_loss: 0.0304 - val_acc: 0.9930
Epoch 161/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0353 - acc: 0.9909 - val_loss: 0.0303 - val_acc: 0.9930
Epoch 162/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0335 - acc: 0.9922 - val_loss: 0.0303 - val_acc: 0.9930
Epoch 163/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0350 - acc: 0.9914 - val_loss: 0.0302 - val_acc: 0.9930
Epoch 164/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0345 - acc: 0.9911 - val_loss: 0.0302 - val_acc: 0.9930
Epoch 165/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0350 - acc: 0.9908 - val_loss: 0.0301 - val_acc: 0.9930
Epoch 166/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0345 - acc: 0.9916 - val_loss: 0.0301 - val_acc: 0.9930
Epoch 167/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0336 - acc: 0.9912 - val_loss: 0.0301 - val_acc: 0.9930
Epoch 168/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0341 - acc: 0.9910 - val_loss: 0.0300 - val_acc: 0.9930
Epoch 169/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0346 - acc: 0.9911 - val_loss: 0.0300 - val_acc: 0.9930
Epoch 170/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0341 - acc: 0.9913 - val_loss: 0.0299 - val_acc: 0.9930
Epoch 171/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0339 - acc: 0.9913 - val_loss: 0.0299 - val_acc: 0.9930
Epoch 172/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0333 - acc: 0.9919 - val_loss: 0.0299 - val_acc: 0.9930
Epoch 173/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0328 - acc: 0.9919 - val_loss: 0.0298 - val_acc: 0.9930
Epoch 174/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0337 - acc: 0.9910 - val_loss: 0.0298 - val_acc: 0.9930
Epoch 175/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0341 - acc: 0.9914 - val_loss: 0.0297 - val_acc: 0.9930
Epoch 176/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0336 - acc: 0.9914 - val_loss: 0.0297 - val_acc: 0.9930
Epoch 177/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0342 - acc: 0.9911 - val_loss: 0.0297 - val_acc: 0.9930
Epoch 178/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0335 - acc: 0.9918 - val_loss: 0.0296 - val_acc: 0.9930
Epoch 179/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0340 - acc: 0.9910 - val_loss: 0.0296 - val_acc: 0.9930
Epoch 180/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0333 - acc: 0.9913 - val_loss: 0.0296 - val_acc: 0.9930
Epoch 181/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0331 - acc: 0.9920 - val_loss: 0.0295 - val_acc: 0.9930
Epoch 182/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0348 - acc: 0.9905 - val_loss: 0.0295 - val_acc: 0.9930
Epoch 183/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0332 - acc: 0.9919 - val_loss: 0.0294 - val_acc: 0.9930
Epoch 184/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0342 - acc: 0.9910 - val_loss: 0.0294 - val_acc: 0.9930
Epoch 185/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0327 - acc: 0.9923 - val_loss: 0.0294 - val_acc: 0.9930
Epoch 186/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0340 - acc: 0.9914 - val_loss: 0.0293 - val_acc: 0.9930
Epoch 187/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0325 - acc: 0.9919 - val_loss: 0.0293 - val_acc: 0.9930
Epoch 188/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0332 - acc: 0.9920 - val_loss: 0.0293 - val_acc: 0.9930
Epoch 189/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0332 - acc: 0.9917 - val_loss: 0.0292 - val_acc: 0.9930
Epoch 190/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0332 - acc: 0.9914 - val_loss: 0.0292 - val_acc: 0.9930
Epoch 191/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0334 - acc: 0.9911 - val_loss: 0.0292 - val_acc: 0.9930
Epoch 192/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0340 - acc: 0.9910 - val_loss: 0.0291 - val_acc: 0.9930
Epoch 193/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0324 - acc: 0.9915 - val_loss: 0.0291 - val_acc: 0.9930
Epoch 194/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0327 - acc: 0.9920 - val_loss: 0.0291 - val_acc: 0.9930
Epoch 195/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0325 - acc: 0.9918 - val_loss: 0.0290 - val_acc: 0.9930
Epoch 196/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0327 - acc: 0.9920 - val_loss: 0.0290 - val_acc: 0.9930
Epoch 197/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0329 - acc: 0.9919 - val_loss: 0.0290 - val_acc: 0.9930
Epoch 198/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0337 - acc: 0.9914 - val_loss: 0.0289 - val_acc: 0.9930
Epoch 199/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0329 - acc: 0.9918 - val_loss: 0.0289 - val_acc: 0.9930
Epoch 200/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0325 - acc: 0.9922 - val_loss: 0.0289 - val_acc: 0.9930
Epoch 201/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0318 - acc: 0.9918 - val_loss: 0.0289 - val_acc: 0.9930
Epoch 202/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0331 - acc: 0.9910 - val_loss: 0.0288 - val_acc: 0.9930
Epoch 203/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0334 - acc: 0.9915 - val_loss: 0.0288 - val_acc: 0.9930
Epoch 204/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0329 - acc: 0.9918 - val_loss: 0.0288 - val_acc: 0.9930
Epoch 205/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0328 - acc: 0.9914 - val_loss: 0.0287 - val_acc: 0.9930
Epoch 206/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0323 - acc: 0.9917 - val_loss: 0.0287 - val_acc: 0.9930
Epoch 207/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0331 - acc: 0.9915 - val_loss: 0.0287 - val_acc: 0.9930
Epoch 208/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0326 - acc: 0.9915 - val_loss: 0.0286 - val_acc: 0.9930
Epoch 209/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0328 - acc: 0.9913 - val_loss: 0.0286 - val_acc: 0.9930
Epoch 210/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0311 - acc: 0.9923 - val_loss: 0.0286 - val_acc: 0.9930
Epoch 211/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0323 - acc: 0.9922 - val_loss: 0.0286 - val_acc: 0.9930
Epoch 212/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0318 - acc: 0.9927 - val_loss: 0.0285 - val_acc: 0.9930
Epoch 213/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0320 - acc: 0.9921 - val_loss: 0.0285 - val_acc: 0.9930
Epoch 214/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0318 - acc: 0.9915 - val_loss: 0.0285 - val_acc: 0.9930
Epoch 215/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0318 - acc: 0.9920 - val_loss: 0.0285 - val_acc: 0.9930
Epoch 216/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0326 - acc: 0.9920 - val_loss: 0.0284 - val_acc: 0.9930
Epoch 217/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0321 - acc: 0.9920 - val_loss: 0.0284 - val_acc: 0.9930
Epoch 218/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0322 - acc: 0.9921 - val_loss: 0.0284 - val_acc: 0.9930
Epoch 219/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0325 - acc: 0.9914 - val_loss: 0.0284 - val_acc: 0.9930
Epoch 220/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0329 - acc: 0.9905 - val_loss: 0.0283 - val_acc: 0.9930
Epoch 221/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0317 - acc: 0.9918 - val_loss: 0.0283 - val_acc: 0.9930
Epoch 222/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0314 - acc: 0.9926 - val_loss: 0.0283 - val_acc: 0.9930
Epoch 223/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0333 - acc: 0.9911 - val_loss: 0.0282 - val_acc: 0.9930
Epoch 224/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0316 - acc: 0.9921 - val_loss: 0.0282 - val_acc: 0.9930
Epoch 225/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0318 - acc: 0.9917 - val_loss: 0.0282 - val_acc: 0.9930
Epoch 226/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0320 - acc: 0.9918 - val_loss: 0.0282 - val_acc: 0.9930
Epoch 227/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0320 - acc: 0.9919 - val_loss: 0.0281 - val_acc: 0.9930
Epoch 228/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0320 - acc: 0.9921 - val_loss: 0.0281 - val_acc: 0.9930
Epoch 229/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0318 - acc: 0.9920 - val_loss: 0.0281 - val_acc: 0.9930
Epoch 230/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0316 - acc: 0.9919 - val_loss: 0.0281 - val_acc: 0.9930
Epoch 231/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0317 - acc: 0.9919 - val_loss: 0.0281 - val_acc: 0.9930
Epoch 232/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0315 - acc: 0.9919 - val_loss: 0.0280 - val_acc: 0.9930
Epoch 233/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0312 - acc: 0.9923 - val_loss: 0.0280 - val_acc: 0.9930
Epoch 234/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0312 - acc: 0.9921 - val_loss: 0.0280 - val_acc: 0.9932
Epoch 235/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0319 - acc: 0.9911 - val_loss: 0.0280 - val_acc: 0.9932
Epoch 236/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0311 - acc: 0.9925 - val_loss: 0.0279 - val_acc: 0.9932
Epoch 237/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0310 - acc: 0.9916 - val_loss: 0.0279 - val_acc: 0.9932
Epoch 238/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0312 - acc: 0.9922 - val_loss: 0.0279 - val_acc: 0.9932
Epoch 239/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0319 - acc: 0.9914 - val_loss: 0.0279 - val_acc: 0.9932
Epoch 240/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0303 - acc: 0.9927 - val_loss: 0.0278 - val_acc: 0.9932
Epoch 241/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0308 - acc: 0.9920 - val_loss: 0.0278 - val_acc: 0.9932
Epoch 242/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0309 - acc: 0.9918 - val_loss: 0.0278 - val_acc: 0.9932
Epoch 243/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0312 - acc: 0.9917 - val_loss: 0.0278 - val_acc: 0.9932
Epoch 244/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0315 - acc: 0.9923 - val_loss: 0.0278 - val_acc: 0.9932
Epoch 245/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0318 - acc: 0.9920 - val_loss: 0.0277 - val_acc: 0.9932
Epoch 246/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0312 - acc: 0.9922 - val_loss: 0.0277 - val_acc: 0.9932
Epoch 247/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0316 - acc: 0.9914 - val_loss: 0.0277 - val_acc: 0.9932
Epoch 248/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0314 - acc: 0.9914 - val_loss: 0.0277 - val_acc: 0.9932
Epoch 249/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0315 - acc: 0.9917 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 250/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0311 - acc: 0.9920 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 251/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0314 - acc: 0.9925 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 252/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0317 - acc: 0.9914 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 253/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0316 - acc: 0.9922 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 254/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0306 - acc: 0.9924 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 255/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0309 - acc: 0.9910 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 256/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0310 - acc: 0.9922 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 257/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0307 - acc: 0.9920 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 258/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0312 - acc: 0.9917 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 259/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0306 - acc: 0.9922 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 260/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0306 - acc: 0.9922 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 261/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0314 - acc: 0.9913 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 262/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0309 - acc: 0.9923 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 263/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0311 - acc: 0.9919 - val_loss: 0.0273 - val_acc: 0.9932
Epoch 264/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0307 - acc: 0.9919 - val_loss: 0.0273 - val_acc: 0.9932
Epoch 265/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0304 - acc: 0.9920 - val_loss: 0.0273 - val_acc: 0.9932
Epoch 266/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0305 - acc: 0.9925 - val_loss: 0.0273 - val_acc: 0.9932
Epoch 267/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0311 - acc: 0.9921 - val_loss: 0.0273 - val_acc: 0.9932
Epoch 268/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0301 - acc: 0.9924 - val_loss: 0.0273 - val_acc: 0.9932
Epoch 269/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0314 - acc: 0.9921 - val_loss: 0.0272 - val_acc: 0.9932
Epoch 270/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0307 - acc: 0.9920 - val_loss: 0.0272 - val_acc: 0.9932
Epoch 271/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0309 - acc: 0.9918 - val_loss: 0.0272 - val_acc: 0.9932
Epoch 272/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0307 - acc: 0.9920 - val_loss: 0.0272 - val_acc: 0.9932
Epoch 273/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0297 - acc: 0.9924 - val_loss: 0.0272 - val_acc: 0.9932
Epoch 274/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0299 - acc: 0.9927 - val_loss: 0.0272 - val_acc: 0.9932
Epoch 275/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0301 - acc: 0.9924 - val_loss: 0.0271 - val_acc: 0.9932
Epoch 276/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0308 - acc: 0.9920 - val_loss: 0.0271 - val_acc: 0.9932
Epoch 277/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0315 - acc: 0.9914 - val_loss: 0.0271 - val_acc: 0.9932
Epoch 278/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0298 - acc: 0.9924 - val_loss: 0.0271 - val_acc: 0.9932
Epoch 279/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0302 - acc: 0.9923 - val_loss: 0.0271 - val_acc: 0.9932
Epoch 280/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0310 - acc: 0.9922 - val_loss: 0.0270 - val_acc: 0.9932
Epoch 281/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0303 - acc: 0.9920 - val_loss: 0.0270 - val_acc: 0.9932
Epoch 282/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0300 - acc: 0.9922 - val_loss: 0.0270 - val_acc: 0.9932
Epoch 283/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0300 - acc: 0.9927 - val_loss: 0.0270 - val_acc: 0.9932
Epoch 284/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0299 - acc: 0.9919 - val_loss: 0.0270 - val_acc: 0.9932
Epoch 285/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0301 - acc: 0.9926 - val_loss: 0.0270 - val_acc: 0.9932
Epoch 286/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0307 - acc: 0.9920 - val_loss: 0.0269 - val_acc: 0.9932
Epoch 287/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0303 - acc: 0.9919 - val_loss: 0.0269 - val_acc: 0.9932
Epoch 288/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0307 - acc: 0.9928 - val_loss: 0.0269 - val_acc: 0.9932
Epoch 289/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0305 - acc: 0.9919 - val_loss: 0.0269 - val_acc: 0.9932
Epoch 290/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0303 - acc: 0.9920 - val_loss: 0.0269 - val_acc: 0.9932
Epoch 291/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0305 - acc: 0.9916 - val_loss: 0.0269 - val_acc: 0.9932
Epoch 292/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0297 - acc: 0.9931 - val_loss: 0.0268 - val_acc: 0.9932
Epoch 293/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0295 - acc: 0.9924 - val_loss: 0.0268 - val_acc: 0.9932
Epoch 294/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0296 - acc: 0.9924 - val_loss: 0.0268 - val_acc: 0.9934
Epoch 295/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0296 - acc: 0.9926 - val_loss: 0.0268 - val_acc: 0.9932
Epoch 296/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0296 - acc: 0.9920 - val_loss: 0.0268 - val_acc: 0.9934
Epoch 297/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0295 - acc: 0.9923 - val_loss: 0.0268 - val_acc: 0.9934
Epoch 298/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0298 - acc: 0.9925 - val_loss: 0.0268 - val_acc: 0.9932
Epoch 299/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0304 - acc: 0.9920 - val_loss: 0.0267 - val_acc: 0.9934
Epoch 300/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0299 - acc: 0.9922 - val_loss: 0.0267 - val_acc: 0.9932
Epoch 301/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0297 - acc: 0.9922 - val_loss: 0.0267 - val_acc: 0.9934
Epoch 302/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0295 - acc: 0.9927 - val_loss: 0.0267 - val_acc: 0.9934
Epoch 303/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0301 - acc: 0.9922 - val_loss: 0.0267 - val_acc: 0.9934
Epoch 304/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0301 - acc: 0.9923 - val_loss: 0.0267 - val_acc: 0.9934
Epoch 305/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0294 - acc: 0.9920 - val_loss: 0.0266 - val_acc: 0.9934
Epoch 306/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0298 - acc: 0.9926 - val_loss: 0.0266 - val_acc: 0.9934
Epoch 307/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0294 - acc: 0.9919 - val_loss: 0.0266 - val_acc: 0.9934
Epoch 308/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0307 - acc: 0.9919 - val_loss: 0.0266 - val_acc: 0.9934
Epoch 309/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0292 - acc: 0.9917 - val_loss: 0.0266 - val_acc: 0.9934
Epoch 310/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0299 - acc: 0.9919 - val_loss: 0.0266 - val_acc: 0.9934
Epoch 311/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0307 - acc: 0.9916 - val_loss: 0.0265 - val_acc: 0.9934
Epoch 312/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0302 - acc: 0.9916 - val_loss: 0.0265 - val_acc: 0.9934
Epoch 313/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0301 - acc: 0.9922 - val_loss: 0.0265 - val_acc: 0.9934
Epoch 314/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0302 - acc: 0.9919 - val_loss: 0.0265 - val_acc: 0.9934
Epoch 315/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0291 - acc: 0.9929 - val_loss: 0.0265 - val_acc: 0.9934
Epoch 316/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0293 - acc: 0.9922 - val_loss: 0.0265 - val_acc: 0.9934
Epoch 317/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0299 - acc: 0.9925 - val_loss: 0.0265 - val_acc: 0.9934
Epoch 318/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0302 - acc: 0.9917 - val_loss: 0.0264 - val_acc: 0.9934
Epoch 319/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0294 - acc: 0.9919 - val_loss: 0.0264 - val_acc: 0.9934
Epoch 320/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0301 - acc: 0.9920 - val_loss: 0.0264 - val_acc: 0.9934
Epoch 321/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0298 - acc: 0.9917 - val_loss: 0.0264 - val_acc: 0.9934
Epoch 322/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0301 - acc: 0.9918 - val_loss: 0.0264 - val_acc: 0.9934
Epoch 323/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0295 - acc: 0.9919 - val_loss: 0.0264 - val_acc: 0.9934
Epoch 324/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0293 - acc: 0.9926 - val_loss: 0.0264 - val_acc: 0.9934
Epoch 325/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0303 - acc: 0.9919 - val_loss: 0.0264 - val_acc: 0.9934
Epoch 326/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0297 - acc: 0.9921 - val_loss: 0.0263 - val_acc: 0.9934
Epoch 327/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0298 - acc: 0.9916 - val_loss: 0.0263 - val_acc: 0.9934
Epoch 328/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0296 - acc: 0.9922 - val_loss: 0.0263 - val_acc: 0.9934
Epoch 329/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0288 - acc: 0.9922 - val_loss: 0.0263 - val_acc: 0.9934
Epoch 330/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0296 - acc: 0.9920 - val_loss: 0.0263 - val_acc: 0.9934
Epoch 331/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0304 - acc: 0.9917 - val_loss: 0.0263 - val_acc: 0.9934
Epoch 332/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0301 - acc: 0.9919 - val_loss: 0.0263 - val_acc: 0.9934
Epoch 333/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0296 - acc: 0.9922 - val_loss: 0.0262 - val_acc: 0.9934
Epoch 334/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0289 - acc: 0.9926 - val_loss: 0.0262 - val_acc: 0.9934
Epoch 335/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0298 - acc: 0.9914 - val_loss: 0.0262 - val_acc: 0.9934
Epoch 336/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0307 - acc: 0.9915 - val_loss: 0.0262 - val_acc: 0.9934
Epoch 337/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0292 - acc: 0.9930 - val_loss: 0.0262 - val_acc: 0.9934
Epoch 338/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0286 - acc: 0.9926 - val_loss: 0.0262 - val_acc: 0.9934
Epoch 339/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0292 - acc: 0.9921 - val_loss: 0.0262 - val_acc: 0.9934
Epoch 340/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0292 - acc: 0.9918 - val_loss: 0.0262 - val_acc: 0.9934
... [Epochs 341–499 omitted: loss hovers around 0.027–0.030, val_loss declines slowly from 0.0262 to 0.0247, val_acc steady at 0.9934] ...
Epoch 500/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0272 - acc: 0.9926 - val_loss: 0.0247 - val_acc: 0.9934
The model is not good enough
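
The merged model below reports feature shapes [(2048,), (2048,), (2048,), (1536,)], i.e. the bottleneck vectors of InceptionV3, Xception, ResNet50 and InceptionResNetV2 are concatenated into one 7680-dimensional vector before the binary classifier head. The actual helper code is not shown in this output; the following is a minimal numpy sketch, under the assumption that the head is a simple concatenate-then-sigmoid-unit (the weights here are random placeholders, not trained values):

```python
import numpy as np

# Bottleneck feature vectors for one image, with the shapes printed above:
# InceptionV3 / Xception / ResNet50 -> (2048,), InceptionResNetV2 -> (1536,)
rng = np.random.default_rng(0)
feats = [rng.standard_normal(s) for s in (2048, 2048, 2048, 1536)]

# Concatenate into a single 7680-dim merged feature vector
merged = np.concatenate(feats)
assert merged.shape == (7680,)

# One sigmoid unit on top (hypothetical, untrained weights) -> P(dog)
w = rng.standard_normal(merged.shape[0]) * 0.01
b = 0.0
p_dog = 1.0 / (1.0 + np.exp(-(merged @ w + b)))
assert 0.0 < p_dog < 1.0
```

The printed hyperparameters (lr: 0.0001, mom: 0.2, decay: 1e-05) suggest SGD with momentum and Keras-style time-based learning-rate decay, where the effective rate at iteration t is lr / (1 + decay * t).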

Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2 
with lr:0.0001 ,mom:0.2 , decay:1e-05
[(2048,), (2048,), (2048,), (1536,)]
model created
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 62us/step - loss: 0.6342 - acc: 0.6442 - val_loss: 0.4194 - val_acc: 0.9056
Epoch 2/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.3884 - acc: 0.8659 - val_loss: 0.2766 - val_acc: 0.9744
Epoch 3/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.2811 - acc: 0.9323 - val_loss: 0.2068 - val_acc: 0.9844
Epoch 4/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.2229 - acc: 0.9561 - val_loss: 0.1669 - val_acc: 0.9854
Epoch 5/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.1845 - acc: 0.9670 - val_loss: 0.1414 - val_acc: 0.9862
... [Epochs 6–95 omitted: loss falls from 0.1605 to 0.0326, val_loss from 0.1238 to 0.0265, val_acc climbs to 0.9940] ...
Epoch 96/500
20000/20000 [==============================] - 1s 53us/step - loss: 0.0317 - acc: 0.9918 - val_loss: 0.0264 - val_acc: 0.9940
Epoch 97/500
20000/20000 [==============================] - 1s 53us/step - loss: 0.0324 - acc: 0.9914 - val_loss: 0.0263 - val_acc: 0.9940
Epoch 98/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0329 - acc: 0.9911 - val_loss: 0.0262 - val_acc: 0.9940
Epoch 99/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0319 - acc: 0.9919 - val_loss: 0.0261 - val_acc: 0.9940
Epoch 100/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0311 - acc: 0.9920 - val_loss: 0.0260 - val_acc: 0.9940
Epoch 101/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0308 - acc: 0.9919 - val_loss: 0.0259 - val_acc: 0.9940
Epoch 102/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0309 - acc: 0.9922 - val_loss: 0.0258 - val_acc: 0.9940
Epoch 103/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0319 - acc: 0.9915 - val_loss: 0.0257 - val_acc: 0.9940
Epoch 104/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0317 - acc: 0.9914 - val_loss: 0.0257 - val_acc: 0.9940
Epoch 105/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0317 - acc: 0.9914 - val_loss: 0.0256 - val_acc: 0.9940
Epoch 106/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0322 - acc: 0.9915 - val_loss: 0.0254 - val_acc: 0.9942
Epoch 107/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0320 - acc: 0.9920 - val_loss: 0.0254 - val_acc: 0.9942
Epoch 108/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0312 - acc: 0.9919 - val_loss: 0.0253 - val_acc: 0.9940
Epoch 109/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0304 - acc: 0.9920 - val_loss: 0.0252 - val_acc: 0.9942
Epoch 110/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0300 - acc: 0.9919 - val_loss: 0.0251 - val_acc: 0.9942
Epoch 111/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0312 - acc: 0.9918 - val_loss: 0.0250 - val_acc: 0.9942
Epoch 112/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0300 - acc: 0.9925 - val_loss: 0.0249 - val_acc: 0.9942
Epoch 113/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0305 - acc: 0.9920 - val_loss: 0.0249 - val_acc: 0.9942
Epoch 114/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0305 - acc: 0.9917 - val_loss: 0.0248 - val_acc: 0.9942
Epoch 115/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0301 - acc: 0.9922 - val_loss: 0.0247 - val_acc: 0.9942
Epoch 116/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0297 - acc: 0.9930 - val_loss: 0.0246 - val_acc: 0.9942
Epoch 117/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0301 - acc: 0.9915 - val_loss: 0.0246 - val_acc: 0.9942
Epoch 118/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0294 - acc: 0.9920 - val_loss: 0.0245 - val_acc: 0.9942
Epoch 119/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0302 - acc: 0.9921 - val_loss: 0.0244 - val_acc: 0.9942
Epoch 120/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0300 - acc: 0.9917 - val_loss: 0.0244 - val_acc: 0.9942
Epoch 121/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0296 - acc: 0.9922 - val_loss: 0.0243 - val_acc: 0.9942
Epoch 122/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0286 - acc: 0.9924 - val_loss: 0.0242 - val_acc: 0.9942
Epoch 123/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0290 - acc: 0.9922 - val_loss: 0.0241 - val_acc: 0.9942
Epoch 124/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0291 - acc: 0.9921 - val_loss: 0.0241 - val_acc: 0.9944
Epoch 125/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0295 - acc: 0.9917 - val_loss: 0.0240 - val_acc: 0.9944
Epoch 126/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0289 - acc: 0.9923 - val_loss: 0.0239 - val_acc: 0.9942
Epoch 127/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0283 - acc: 0.9924 - val_loss: 0.0239 - val_acc: 0.9944
Epoch 128/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0286 - acc: 0.9930 - val_loss: 0.0238 - val_acc: 0.9944
Epoch 129/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0292 - acc: 0.9919 - val_loss: 0.0237 - val_acc: 0.9944
Epoch 130/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0294 - acc: 0.9918 - val_loss: 0.0237 - val_acc: 0.9944
Epoch 131/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0284 - acc: 0.9917 - val_loss: 0.0236 - val_acc: 0.9944
Epoch 132/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0296 - acc: 0.9916 - val_loss: 0.0236 - val_acc: 0.9944
Epoch 133/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0299 - acc: 0.9912 - val_loss: 0.0235 - val_acc: 0.9946
Epoch 134/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0278 - acc: 0.9928 - val_loss: 0.0234 - val_acc: 0.9944
Epoch 135/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0287 - acc: 0.9920 - val_loss: 0.0234 - val_acc: 0.9946
Epoch 136/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0283 - acc: 0.9916 - val_loss: 0.0233 - val_acc: 0.9946
Epoch 137/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0276 - acc: 0.9932 - val_loss: 0.0233 - val_acc: 0.9946
Epoch 138/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0273 - acc: 0.9927 - val_loss: 0.0232 - val_acc: 0.9946
Epoch 139/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0277 - acc: 0.9932 - val_loss: 0.0232 - val_acc: 0.9946
Epoch 140/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0287 - acc: 0.9922 - val_loss: 0.0231 - val_acc: 0.9946
Epoch 141/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0272 - acc: 0.9923 - val_loss: 0.0231 - val_acc: 0.9946
Epoch 142/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0288 - acc: 0.9921 - val_loss: 0.0230 - val_acc: 0.9946
Epoch 143/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0275 - acc: 0.9926 - val_loss: 0.0230 - val_acc: 0.9946
Epoch 144/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0282 - acc: 0.9919 - val_loss: 0.0229 - val_acc: 0.9946
Epoch 145/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0276 - acc: 0.9931 - val_loss: 0.0229 - val_acc: 0.9948
Epoch 146/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0273 - acc: 0.9928 - val_loss: 0.0228 - val_acc: 0.9948
Epoch 147/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0274 - acc: 0.9927 - val_loss: 0.0228 - val_acc: 0.9948
Epoch 148/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0274 - acc: 0.9921 - val_loss: 0.0227 - val_acc: 0.9948
Epoch 149/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0269 - acc: 0.9927 - val_loss: 0.0227 - val_acc: 0.9948
Epoch 150/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0281 - acc: 0.9927 - val_loss: 0.0226 - val_acc: 0.9948
Epoch 151/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0274 - acc: 0.9925 - val_loss: 0.0226 - val_acc: 0.9948
Epoch 152/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0272 - acc: 0.9924 - val_loss: 0.0225 - val_acc: 0.9948
Epoch 153/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0269 - acc: 0.9924 - val_loss: 0.0224 - val_acc: 0.9950
Epoch 154/500
20000/20000 [==============================] - 1s 53us/step - loss: 0.0270 - acc: 0.9922 - val_loss: 0.0224 - val_acc: 0.9950
Epoch 155/500
20000/20000 [==============================] - 1s 53us/step - loss: 0.0259 - acc: 0.9931 - val_loss: 0.0224 - val_acc: 0.9950
Epoch 156/500
20000/20000 [==============================] - 1s 52us/step - loss: 0.0267 - acc: 0.9924 - val_loss: 0.0223 - val_acc: 0.9950
Epoch 157/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0264 - acc: 0.9931 - val_loss: 0.0223 - val_acc: 0.9950
Epoch 158/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0263 - acc: 0.9930 - val_loss: 0.0222 - val_acc: 0.9950
Epoch 159/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0279 - acc: 0.9923 - val_loss: 0.0222 - val_acc: 0.9950
Epoch 160/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0265 - acc: 0.9924 - val_loss: 0.0221 - val_acc: 0.9950
Epoch 161/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0274 - acc: 0.9920 - val_loss: 0.0221 - val_acc: 0.9950
Epoch 162/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0265 - acc: 0.9926 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 163/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0262 - acc: 0.9926 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 164/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0258 - acc: 0.9932 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 165/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0267 - acc: 0.9928 - val_loss: 0.0219 - val_acc: 0.9950
Epoch 166/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0274 - acc: 0.9921 - val_loss: 0.0219 - val_acc: 0.9950
Epoch 167/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0259 - acc: 0.9924 - val_loss: 0.0219 - val_acc: 0.9950
Epoch 168/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0259 - acc: 0.9929 - val_loss: 0.0218 - val_acc: 0.9950
Epoch 169/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0264 - acc: 0.9923 - val_loss: 0.0218 - val_acc: 0.9950
Epoch 170/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0263 - acc: 0.9935 - val_loss: 0.0218 - val_acc: 0.9950
Epoch 171/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0262 - acc: 0.9929 - val_loss: 0.0217 - val_acc: 0.9950
Epoch 172/500
20000/20000 [==============================] - 1s 54us/step - loss: 0.0257 - acc: 0.9931 - val_loss: 0.0217 - val_acc: 0.9950
Epoch 173/500
20000/20000 [==============================] - 1s 52us/step - loss: 0.0259 - acc: 0.9932 - val_loss: 0.0217 - val_acc: 0.9950
Epoch 174/500
20000/20000 [==============================] - 1s 52us/step - loss: 0.0258 - acc: 0.9929 - val_loss: 0.0216 - val_acc: 0.9950
Epoch 175/500
20000/20000 [==============================] - 1s 54us/step - loss: 0.0261 - acc: 0.9925 - val_loss: 0.0216 - val_acc: 0.9950
Epoch 176/500
20000/20000 [==============================] - 1s 54us/step - loss: 0.0260 - acc: 0.9928 - val_loss: 0.0215 - val_acc: 0.9950
Epoch 177/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0257 - acc: 0.9931 - val_loss: 0.0215 - val_acc: 0.9950
Epoch 178/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0250 - acc: 0.9929 - val_loss: 0.0215 - val_acc: 0.9950
Epoch 179/500
20000/20000 [==============================] - 1s 53us/step - loss: 0.0259 - acc: 0.9922 - val_loss: 0.0214 - val_acc: 0.9950
Epoch 180/500
20000/20000 [==============================] - 1s 53us/step - loss: 0.0247 - acc: 0.9932 - val_loss: 0.0214 - val_acc: 0.9950
Epoch 181/500
20000/20000 [==============================] - 1s 53us/step - loss: 0.0260 - acc: 0.9929 - val_loss: 0.0214 - val_acc: 0.9950
Epoch 182/500
20000/20000 [==============================] - 1s 54us/step - loss: 0.0259 - acc: 0.9922 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 183/500
20000/20000 [==============================] - 1s 54us/step - loss: 0.0249 - acc: 0.9930 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 184/500
20000/20000 [==============================] - 1s 53us/step - loss: 0.0260 - acc: 0.9923 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 185/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0259 - acc: 0.9921 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 186/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0265 - acc: 0.9922 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 187/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0250 - acc: 0.9932 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 188/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0248 - acc: 0.9931 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 189/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0257 - acc: 0.9925 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 190/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0242 - acc: 0.9927 - val_loss: 0.0210 - val_acc: 0.9950
Epoch 191/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0246 - acc: 0.9931 - val_loss: 0.0210 - val_acc: 0.9948
Epoch 192/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0252 - acc: 0.9926 - val_loss: 0.0210 - val_acc: 0.9948
Epoch 193/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0250 - acc: 0.9925 - val_loss: 0.0209 - val_acc: 0.9948
Epoch 194/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0252 - acc: 0.9924 - val_loss: 0.0209 - val_acc: 0.9948
Epoch 195/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0253 - acc: 0.9926 - val_loss: 0.0209 - val_acc: 0.9950
Epoch 196/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0256 - acc: 0.9927 - val_loss: 0.0209 - val_acc: 0.9950
Epoch 197/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0249 - acc: 0.9929 - val_loss: 0.0208 - val_acc: 0.9950
Epoch 198/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0247 - acc: 0.9929 - val_loss: 0.0208 - val_acc: 0.9950
Epoch 199/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0245 - acc: 0.9933 - val_loss: 0.0208 - val_acc: 0.9950
Epoch 200/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0250 - acc: 0.9928 - val_loss: 0.0208 - val_acc: 0.9950
Epoch 201/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0249 - acc: 0.9932 - val_loss: 0.0208 - val_acc: 0.9950
Epoch 202/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0242 - acc: 0.9937 - val_loss: 0.0207 - val_acc: 0.9950
Epoch 203/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0240 - acc: 0.9926 - val_loss: 0.0207 - val_acc: 0.9950
Epoch 204/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0245 - acc: 0.9927 - val_loss: 0.0206 - val_acc: 0.9950
Epoch 205/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0244 - acc: 0.9929 - val_loss: 0.0206 - val_acc: 0.9950
Epoch 206/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0239 - acc: 0.9933 - val_loss: 0.0206 - val_acc: 0.9950
Epoch 207/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0249 - acc: 0.9929 - val_loss: 0.0206 - val_acc: 0.9950
Epoch 208/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0242 - acc: 0.9929 - val_loss: 0.0205 - val_acc: 0.9950
Epoch 209/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0246 - acc: 0.9928 - val_loss: 0.0205 - val_acc: 0.9950
Epoch 210/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0244 - acc: 0.9926 - val_loss: 0.0205 - val_acc: 0.9950
Epoch 211/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0241 - acc: 0.9936 - val_loss: 0.0204 - val_acc: 0.9950
Epoch 212/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0246 - acc: 0.9924 - val_loss: 0.0204 - val_acc: 0.9950
Epoch 213/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0244 - acc: 0.9931 - val_loss: 0.0204 - val_acc: 0.9950
Epoch 214/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0244 - acc: 0.9932 - val_loss: 0.0204 - val_acc: 0.9950
Epoch 215/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0252 - acc: 0.9922 - val_loss: 0.0203 - val_acc: 0.9948
Epoch 216/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0236 - acc: 0.9937 - val_loss: 0.0203 - val_acc: 0.9948
Epoch 217/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0239 - acc: 0.9931 - val_loss: 0.0203 - val_acc: 0.9948
Epoch 218/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0247 - acc: 0.9931 - val_loss: 0.0203 - val_acc: 0.9948
Epoch 219/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0247 - acc: 0.9928 - val_loss: 0.0202 - val_acc: 0.9948
Epoch 220/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0235 - acc: 0.9937 - val_loss: 0.0202 - val_acc: 0.9948
Epoch 221/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0243 - acc: 0.9933 - val_loss: 0.0202 - val_acc: 0.9948
Epoch 222/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9930 - val_loss: 0.0202 - val_acc: 0.9948
Epoch 223/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0242 - acc: 0.9937 - val_loss: 0.0201 - val_acc: 0.9948
Epoch 224/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0235 - acc: 0.9930 - val_loss: 0.0201 - val_acc: 0.9948
Epoch 225/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0235 - acc: 0.9927 - val_loss: 0.0201 - val_acc: 0.9950
Epoch 226/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0242 - acc: 0.9934 - val_loss: 0.0201 - val_acc: 0.9948
Epoch 227/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0240 - acc: 0.9930 - val_loss: 0.0201 - val_acc: 0.9948
Epoch 228/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0229 - acc: 0.9938 - val_loss: 0.0200 - val_acc: 0.9948
Epoch 229/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0243 - acc: 0.9929 - val_loss: 0.0200 - val_acc: 0.9948
Epoch 230/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0224 - acc: 0.9937 - val_loss: 0.0200 - val_acc: 0.9948
Epoch 231/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0239 - acc: 0.9932 - val_loss: 0.0200 - val_acc: 0.9948
Epoch 232/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0242 - acc: 0.9933 - val_loss: 0.0199 - val_acc: 0.9948
Epoch 233/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0228 - acc: 0.9935 - val_loss: 0.0199 - val_acc: 0.9948
Epoch 234/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0235 - acc: 0.9938 - val_loss: 0.0199 - val_acc: 0.9950
Epoch 235/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0238 - acc: 0.9929 - val_loss: 0.0199 - val_acc: 0.9950
Epoch 236/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0235 - acc: 0.9929 - val_loss: 0.0199 - val_acc: 0.9950
Epoch 237/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0230 - acc: 0.9938 - val_loss: 0.0198 - val_acc: 0.9950
Epoch 238/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0239 - acc: 0.9934 - val_loss: 0.0198 - val_acc: 0.9950
Epoch 239/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9937 - val_loss: 0.0198 - val_acc: 0.9948
Epoch 240/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0229 - acc: 0.9936 - val_loss: 0.0198 - val_acc: 0.9950
Epoch 241/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0225 - acc: 0.9942 - val_loss: 0.0198 - val_acc: 0.9950
Epoch 242/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0243 - acc: 0.9923 - val_loss: 0.0197 - val_acc: 0.9950
Epoch 243/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0229 - acc: 0.9931 - val_loss: 0.0197 - val_acc: 0.9950
Epoch 244/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0226 - acc: 0.9940 - val_loss: 0.0197 - val_acc: 0.9950
Epoch 245/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0233 - acc: 0.9935 - val_loss: 0.0197 - val_acc: 0.9950
Epoch 246/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0230 - acc: 0.9936 - val_loss: 0.0197 - val_acc: 0.9950
Epoch 247/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0234 - acc: 0.9931 - val_loss: 0.0197 - val_acc: 0.9950
Epoch 248/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0236 - acc: 0.9929 - val_loss: 0.0196 - val_acc: 0.9950
Epoch 249/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0238 - acc: 0.9937 - val_loss: 0.0196 - val_acc: 0.9950
Epoch 250/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0232 - acc: 0.9933 - val_loss: 0.0196 - val_acc: 0.9948
Epoch 251/500
20000/20000 [==============================] - 1s 52us/step - loss: 0.0231 - acc: 0.9929 - val_loss: 0.0196 - val_acc: 0.9948
Epoch 252/500
20000/20000 [==============================] - 1s 53us/step - loss: 0.0215 - acc: 0.9940 - val_loss: 0.0195 - val_acc: 0.9948
Epoch 253/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0229 - acc: 0.9934 - val_loss: 0.0195 - val_acc: 0.9948
Epoch 254/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0233 - acc: 0.9933 - val_loss: 0.0195 - val_acc: 0.9948
Epoch 255/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0227 - acc: 0.9937 - val_loss: 0.0195 - val_acc: 0.9950
Epoch 256/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0231 - acc: 0.9936 - val_loss: 0.0195 - val_acc: 0.9948
Epoch 257/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0229 - acc: 0.9932 - val_loss: 0.0194 - val_acc: 0.9950
Epoch 258/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0224 - acc: 0.9937 - val_loss: 0.0194 - val_acc: 0.9948
Epoch 259/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0225 - acc: 0.9935 - val_loss: 0.0194 - val_acc: 0.9948
Epoch 260/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0231 - acc: 0.9934 - val_loss: 0.0194 - val_acc: 0.9950
Epoch 261/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0227 - acc: 0.9937 - val_loss: 0.0194 - val_acc: 0.9948
Epoch 262/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0228 - acc: 0.9937 - val_loss: 0.0194 - val_acc: 0.9948
Epoch 263/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0227 - acc: 0.9934 - val_loss: 0.0193 - val_acc: 0.9948
Epoch 264/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0232 - acc: 0.9929 - val_loss: 0.0193 - val_acc: 0.9948
Epoch 265/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0230 - acc: 0.9937 - val_loss: 0.0193 - val_acc: 0.9948
Epoch 266/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0221 - acc: 0.9942 - val_loss: 0.0193 - val_acc: 0.9948
Epoch 267/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0222 - acc: 0.9936 - val_loss: 0.0193 - val_acc: 0.9948
Epoch 268/500
20000/20000 [==============================] - 1s 52us/step - loss: 0.0216 - acc: 0.9942 - val_loss: 0.0193 - val_acc: 0.9948
Epoch 269/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0229 - acc: 0.9932 - val_loss: 0.0192 - val_acc: 0.9948
Epoch 270/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0228 - acc: 0.9931 - val_loss: 0.0192 - val_acc: 0.9948
Epoch 271/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0224 - acc: 0.9929 - val_loss: 0.0192 - val_acc: 0.9948
Epoch 272/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0224 - acc: 0.9934 - val_loss: 0.0192 - val_acc: 0.9948
Epoch 273/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0226 - acc: 0.9937 - val_loss: 0.0192 - val_acc: 0.9948
Epoch 274/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0222 - acc: 0.9938 - val_loss: 0.0192 - val_acc: 0.9946
Epoch 275/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0226 - acc: 0.9929 - val_loss: 0.0191 - val_acc: 0.9946
Epoch 276/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0228 - acc: 0.9933 - val_loss: 0.0191 - val_acc: 0.9948
Epoch 277/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0213 - acc: 0.9933 - val_loss: 0.0191 - val_acc: 0.9946
Epoch 278/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0216 - acc: 0.9939 - val_loss: 0.0191 - val_acc: 0.9948
Epoch 279/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0227 - acc: 0.9932 - val_loss: 0.0191 - val_acc: 0.9948
Epoch 280/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0221 - acc: 0.9937 - val_loss: 0.0190 - val_acc: 0.9950
Epoch 281/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0229 - acc: 0.9936 - val_loss: 0.0190 - val_acc: 0.9950
Epoch 282/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0223 - acc: 0.9936 - val_loss: 0.0190 - val_acc: 0.9950
Epoch 283/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0226 - acc: 0.9934 - val_loss: 0.0190 - val_acc: 0.9950
Epoch 284/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0228 - acc: 0.9931 - val_loss: 0.0190 - val_acc: 0.9950
Epoch 285/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0227 - acc: 0.9937 - val_loss: 0.0190 - val_acc: 0.9950
Epoch 286/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0223 - acc: 0.9937 - val_loss: 0.0189 - val_acc: 0.9950
Epoch 287/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0214 - acc: 0.9939 - val_loss: 0.0189 - val_acc: 0.9950
Epoch 288/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0220 - acc: 0.9937 - val_loss: 0.0189 - val_acc: 0.9950
Epoch 289/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0221 - acc: 0.9933 - val_loss: 0.0189 - val_acc: 0.9950
Epoch 290/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0214 - acc: 0.9937 - val_loss: 0.0189 - val_acc: 0.9950
Epoch 291/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0222 - acc: 0.9935 - val_loss: 0.0189 - val_acc: 0.9950
Epoch 292/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0220 - acc: 0.9937 - val_loss: 0.0189 - val_acc: 0.9950
Epoch 293/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0223 - acc: 0.9936 - val_loss: 0.0188 - val_acc: 0.9950
Epoch 294/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0218 - acc: 0.9937 - val_loss: 0.0188 - val_acc: 0.9950
Epoch 295/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0216 - acc: 0.9932 - val_loss: 0.0188 - val_acc: 0.9950
Epoch 296/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0219 - acc: 0.9932 - val_loss: 0.0188 - val_acc: 0.9950
Epoch 297/500
20000/20000 [==============================] - 1s 53us/step - loss: 0.0223 - acc: 0.9936 - val_loss: 0.0188 - val_acc: 0.9950
Epoch 298/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0223 - acc: 0.9931 - val_loss: 0.0188 - val_acc: 0.9950
Epoch 299/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0206 - acc: 0.9942 - val_loss: 0.0188 - val_acc: 0.9950
Epoch 300/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0213 - acc: 0.9936 - val_loss: 0.0187 - val_acc: 0.9950
Epoch 301/500
...
(epochs 301-497 elided: per-epoch metrics changed only marginally — loss drifted from ~0.023 to ~0.019, val_loss from 0.0187 to 0.0170, and val_acc hovered between 0.9948 and 0.9952 throughout)
...
Epoch 498/500
20000/20000 [==============================] - 1s 52us/step - loss: 0.0191 - acc: 0.9945 - val_loss: 0.0170 - val_acc: 0.9950
Epoch 499/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0188 - acc: 0.9941 - val_loss: 0.0170 - val_acc: 0.9950
Epoch 500/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0192 - acc: 0.9944 - val_loss: 0.0170 - val_acc: 0.9950
The model is not good enough
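The run above spends roughly 200 epochs on a plateau (val_loss stuck near 0.017) before the final quality check fails. Keras provides `keras.callbacks.EarlyStopping(monitor='val_loss', patience=...)` for exactly this; the stopping rule it implements can be sketched in plain Python (function name and the simulated loss history below are illustrative, not from the notebook):

```python
def early_stop_epoch(val_losses, patience=10, min_delta=1e-4):
    """Index of the epoch where training would halt: stop once val_loss
    has failed to improve by at least `min_delta` for `patience` epochs."""
    best, wait = float("inf"), 0
    for epoch, vl in enumerate(val_losses):
        if vl < best - min_delta:
            best, wait = vl, 0   # improvement: reset the patience counter
        else:
            wait += 1
            if wait >= patience:
                return epoch     # patience exhausted: stop here
    return len(val_losses) - 1   # never triggered: ran to the end

# A history shaped like the run above: quick drop, then a long plateau.
history = [0.05, 0.03, 0.02] + [0.0188] * 200
print(early_stop_epoch(history, patience=10))  # → 13
```

With a callback like this, a 500-epoch budget is harmless: training halts a few epochs after the plateau begins instead of burning the full budget.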

Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2 
with lr:0.0001 ,mom:0.2 , decay:1e-06
[(2048,), (2048,), (2048,), (1536,)]
model created
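The shape list printed above — [(2048,), (2048,), (2048,), (1536,)] — gives the per-image bottleneck feature sizes of the four base networks. Merging them presumably amounts to concatenating along the feature axis before the dense classifier head; a minimal numpy sketch (random stand-in features — the real ones would be the pretrained networks' pooled outputs):

```python
import numpy as np

# Stand-in bottleneck features for a small batch of images; the four
# widths match the shapes printed by the notebook.
n_samples = 4
feats = [np.random.rand(n_samples, d).astype(np.float32)
         for d in (2048, 2048, 2048, 1536)]

# Merge: concatenate along the feature axis -> one 7680-dim vector per
# image, which the dense head is then trained on.
merged = np.concatenate(feats, axis=1)
print(merged.shape)  # → (4, 7680)
```

Training the small dense head on these precomputed, concatenated features is what makes each epoch above take only ~1 second on 20,000 samples: the expensive convolutional forward passes are done once, up front.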
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 70us/step - loss: 0.6259 - acc: 0.6674 - val_loss: 0.4093 - val_acc: 0.8954
Epoch 2/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.3889 - acc: 0.8570 - val_loss: 0.2821 - val_acc: 0.9522
Epoch 3/500
20000/20000 [==============================] - 1s 52us/step - loss: 0.2901 - acc: 0.9191 - val_loss: 0.2164 - val_acc: 0.9682
Epoch 4/500
20000/20000 [==============================] - 1s 52us/step - loss: 0.2328 - acc: 0.9446 - val_loss: 0.1777 - val_acc: 0.9758
Epoch 5/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.1954 - acc: 0.9558 - val_loss: 0.1522 - val_acc: 0.9776
Epoch 6/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.1683 - acc: 0.9645 - val_loss: 0.1343 - val_acc: 0.9802
Epoch 7/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.1515 - acc: 0.9684 - val_loss: 0.1209 - val_acc: 0.9812
Epoch 8/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.1390 - acc: 0.9719 - val_loss: 0.1105 - val_acc: 0.9826
Epoch 9/500
20000/20000 [==============================] - 1s 52us/step - loss: 0.1279 - acc: 0.9752 - val_loss: 0.1022 - val_acc: 0.9842
Epoch 10/500
20000/20000 [==============================] - 1s 52us/step - loss: 0.1198 - acc: 0.9758 - val_loss: 0.0952 - val_acc: 0.9848
Epoch 11/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.1099 - acc: 0.9785 - val_loss: 0.0896 - val_acc: 0.9850
...
(epochs 12-48 elided: steady improvement — loss 0.104 -> 0.045, val_loss 0.0847 -> 0.0386, val_acc 0.9856 -> 0.9910)
...
Epoch 49/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0452 - acc: 0.9887 - val_loss: 0.0382 - val_acc: 0.9910
Epoch 50/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0448 - acc: 0.9896 - val_loss: 0.0378 - val_acc: 0.9912
Epoch 51/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0442 - acc: 0.9888 - val_loss: 0.0375 - val_acc: 0.9914
Epoch 52/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0449 - acc: 0.9884 - val_loss: 0.0371 - val_acc: 0.9918
Epoch 53/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0432 - acc: 0.9904 - val_loss: 0.0368 - val_acc: 0.9916
Epoch 54/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0429 - acc: 0.9893 - val_loss: 0.0365 - val_acc: 0.9918
Epoch 55/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0434 - acc: 0.9901 - val_loss: 0.0361 - val_acc: 0.9918
Epoch 56/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0421 - acc: 0.9898 - val_loss: 0.0358 - val_acc: 0.9924
Epoch 57/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0414 - acc: 0.9897 - val_loss: 0.0355 - val_acc: 0.9924
Epoch 58/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0410 - acc: 0.9907 - val_loss: 0.0352 - val_acc: 0.9924
Epoch 59/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0407 - acc: 0.9900 - val_loss: 0.0349 - val_acc: 0.9924
Epoch 60/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0411 - acc: 0.9897 - val_loss: 0.0347 - val_acc: 0.9924
Epoch 61/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0413 - acc: 0.9896 - val_loss: 0.0344 - val_acc: 0.9924
Epoch 62/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0395 - acc: 0.9902 - val_loss: 0.0341 - val_acc: 0.9924
Epoch 63/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0395 - acc: 0.9900 - val_loss: 0.0339 - val_acc: 0.9924
Epoch 64/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0392 - acc: 0.9901 - val_loss: 0.0336 - val_acc: 0.9924
Epoch 65/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0390 - acc: 0.9900 - val_loss: 0.0334 - val_acc: 0.9926
Epoch 66/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0392 - acc: 0.9903 - val_loss: 0.0331 - val_acc: 0.9926
Epoch 67/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0390 - acc: 0.9900 - val_loss: 0.0329 - val_acc: 0.9926
Epoch 68/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0382 - acc: 0.9902 - val_loss: 0.0327 - val_acc: 0.9926
Epoch 69/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0379 - acc: 0.9906 - val_loss: 0.0325 - val_acc: 0.9926
Epoch 70/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0384 - acc: 0.9908 - val_loss: 0.0322 - val_acc: 0.9926
Epoch 71/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0371 - acc: 0.9907 - val_loss: 0.0321 - val_acc: 0.9926
Epoch 72/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0354 - acc: 0.9917 - val_loss: 0.0319 - val_acc: 0.9926
Epoch 73/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0384 - acc: 0.9898 - val_loss: 0.0317 - val_acc: 0.9926
Epoch 74/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0370 - acc: 0.9905 - val_loss: 0.0315 - val_acc: 0.9926
Epoch 75/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0365 - acc: 0.9906 - val_loss: 0.0313 - val_acc: 0.9926
Epoch 76/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0364 - acc: 0.9906 - val_loss: 0.0311 - val_acc: 0.9926
Epoch 77/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0358 - acc: 0.9909 - val_loss: 0.0309 - val_acc: 0.9928
Epoch 78/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0361 - acc: 0.9907 - val_loss: 0.0308 - val_acc: 0.9928
Epoch 79/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0359 - acc: 0.9900 - val_loss: 0.0306 - val_acc: 0.9930
Epoch 80/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0347 - acc: 0.9915 - val_loss: 0.0304 - val_acc: 0.9932
Epoch 81/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0358 - acc: 0.9908 - val_loss: 0.0302 - val_acc: 0.9932
Epoch 82/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0345 - acc: 0.9909 - val_loss: 0.0301 - val_acc: 0.9932
Epoch 83/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0352 - acc: 0.9902 - val_loss: 0.0299 - val_acc: 0.9932
Epoch 84/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0347 - acc: 0.9911 - val_loss: 0.0298 - val_acc: 0.9932
Epoch 85/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0350 - acc: 0.9905 - val_loss: 0.0296 - val_acc: 0.9934
Epoch 86/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0345 - acc: 0.9919 - val_loss: 0.0295 - val_acc: 0.9934
Epoch 87/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0337 - acc: 0.9914 - val_loss: 0.0293 - val_acc: 0.9934
Epoch 88/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0345 - acc: 0.9914 - val_loss: 0.0292 - val_acc: 0.9934
Epoch 89/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0346 - acc: 0.9908 - val_loss: 0.0291 - val_acc: 0.9934
Epoch 90/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0336 - acc: 0.9905 - val_loss: 0.0289 - val_acc: 0.9934
Epoch 91/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0334 - acc: 0.9911 - val_loss: 0.0288 - val_acc: 0.9934
Epoch 92/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0330 - acc: 0.9915 - val_loss: 0.0286 - val_acc: 0.9934
Epoch 93/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0330 - acc: 0.9910 - val_loss: 0.0285 - val_acc: 0.9934
Epoch 94/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0336 - acc: 0.9909 - val_loss: 0.0284 - val_acc: 0.9934
Epoch 95/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0327 - acc: 0.9919 - val_loss: 0.0282 - val_acc: 0.9934
Epoch 96/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0319 - acc: 0.9919 - val_loss: 0.0281 - val_acc: 0.9934
Epoch 97/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0320 - acc: 0.9918 - val_loss: 0.0280 - val_acc: 0.9934
Epoch 98/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0323 - acc: 0.9911 - val_loss: 0.0279 - val_acc: 0.9934
Epoch 99/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0328 - acc: 0.9909 - val_loss: 0.0278 - val_acc: 0.9934
Epoch 100/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0314 - acc: 0.9923 - val_loss: 0.0277 - val_acc: 0.9934
Epoch 101/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0320 - acc: 0.9919 - val_loss: 0.0276 - val_acc: 0.9934
Epoch 102/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0318 - acc: 0.9914 - val_loss: 0.0275 - val_acc: 0.9934
Epoch 103/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0312 - acc: 0.9917 - val_loss: 0.0273 - val_acc: 0.9936
Epoch 104/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0310 - acc: 0.9919 - val_loss: 0.0272 - val_acc: 0.9936
Epoch 105/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0314 - acc: 0.9923 - val_loss: 0.0271 - val_acc: 0.9938
Epoch 106/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0314 - acc: 0.9914 - val_loss: 0.0270 - val_acc: 0.9938
Epoch 107/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0310 - acc: 0.9919 - val_loss: 0.0269 - val_acc: 0.9936
Epoch 108/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0314 - acc: 0.9914 - val_loss: 0.0269 - val_acc: 0.9936
Epoch 109/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0312 - acc: 0.9916 - val_loss: 0.0268 - val_acc: 0.9936
Epoch 110/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0303 - acc: 0.9920 - val_loss: 0.0267 - val_acc: 0.9936
Epoch 111/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0309 - acc: 0.9917 - val_loss: 0.0266 - val_acc: 0.9938
Epoch 112/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0303 - acc: 0.9920 - val_loss: 0.0264 - val_acc: 0.9938
Epoch 113/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0302 - acc: 0.9921 - val_loss: 0.0264 - val_acc: 0.9938
Epoch 114/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0308 - acc: 0.9909 - val_loss: 0.0262 - val_acc: 0.9940
Epoch 115/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0298 - acc: 0.9922 - val_loss: 0.0261 - val_acc: 0.9940
Epoch 116/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0308 - acc: 0.9911 - val_loss: 0.0260 - val_acc: 0.9940
Epoch 117/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0301 - acc: 0.9919 - val_loss: 0.0259 - val_acc: 0.9940
Epoch 118/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0297 - acc: 0.9924 - val_loss: 0.0259 - val_acc: 0.9940
Epoch 119/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0298 - acc: 0.9921 - val_loss: 0.0258 - val_acc: 0.9940
Epoch 120/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0294 - acc: 0.9922 - val_loss: 0.0257 - val_acc: 0.9942
Epoch 121/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0293 - acc: 0.9918 - val_loss: 0.0256 - val_acc: 0.9942
Epoch 122/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0299 - acc: 0.9919 - val_loss: 0.0255 - val_acc: 0.9942
Epoch 123/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0296 - acc: 0.9920 - val_loss: 0.0255 - val_acc: 0.9942
Epoch 124/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0290 - acc: 0.9919 - val_loss: 0.0254 - val_acc: 0.9940
Epoch 125/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0284 - acc: 0.9922 - val_loss: 0.0253 - val_acc: 0.9940
Epoch 126/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0284 - acc: 0.9919 - val_loss: 0.0252 - val_acc: 0.9942
Epoch 127/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0288 - acc: 0.9920 - val_loss: 0.0252 - val_acc: 0.9942
Epoch 128/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0289 - acc: 0.9922 - val_loss: 0.0251 - val_acc: 0.9940
Epoch 129/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0283 - acc: 0.9922 - val_loss: 0.0251 - val_acc: 0.9940
Epoch 130/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0284 - acc: 0.9925 - val_loss: 0.0250 - val_acc: 0.9940
Epoch 131/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0284 - acc: 0.9919 - val_loss: 0.0249 - val_acc: 0.9940
Epoch 132/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0285 - acc: 0.9921 - val_loss: 0.0248 - val_acc: 0.9942
Epoch 133/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0283 - acc: 0.9925 - val_loss: 0.0248 - val_acc: 0.9942
Epoch 134/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0281 - acc: 0.9919 - val_loss: 0.0247 - val_acc: 0.9942
Epoch 135/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0285 - acc: 0.9922 - val_loss: 0.0246 - val_acc: 0.9942
Epoch 136/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0274 - acc: 0.9928 - val_loss: 0.0246 - val_acc: 0.9942
Epoch 137/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0269 - acc: 0.9923 - val_loss: 0.0245 - val_acc: 0.9942
Epoch 138/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0281 - acc: 0.9920 - val_loss: 0.0245 - val_acc: 0.9942
Epoch 139/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0274 - acc: 0.9925 - val_loss: 0.0244 - val_acc: 0.9942
Epoch 140/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0279 - acc: 0.9920 - val_loss: 0.0243 - val_acc: 0.9942
Epoch 141/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0260 - acc: 0.9937 - val_loss: 0.0242 - val_acc: 0.9942
Epoch 142/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0277 - acc: 0.9922 - val_loss: 0.0242 - val_acc: 0.9942
Epoch 143/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0275 - acc: 0.9924 - val_loss: 0.0241 - val_acc: 0.9944
Epoch 144/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0266 - acc: 0.9926 - val_loss: 0.0241 - val_acc: 0.9944
Epoch 145/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0274 - acc: 0.9928 - val_loss: 0.0240 - val_acc: 0.9944
Epoch 146/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0271 - acc: 0.9927 - val_loss: 0.0239 - val_acc: 0.9944
Epoch 147/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0273 - acc: 0.9929 - val_loss: 0.0239 - val_acc: 0.9944
Epoch 148/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0268 - acc: 0.9927 - val_loss: 0.0238 - val_acc: 0.9944
Epoch 149/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0264 - acc: 0.9927 - val_loss: 0.0238 - val_acc: 0.9944
Epoch 150/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0276 - acc: 0.9920 - val_loss: 0.0237 - val_acc: 0.9944
Epoch 151/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0265 - acc: 0.9931 - val_loss: 0.0237 - val_acc: 0.9944
Epoch 152/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0273 - acc: 0.9919 - val_loss: 0.0236 - val_acc: 0.9944
Epoch 153/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0271 - acc: 0.9926 - val_loss: 0.0236 - val_acc: 0.9944
Epoch 154/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0270 - acc: 0.9926 - val_loss: 0.0235 - val_acc: 0.9944
Epoch 155/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0273 - acc: 0.9919 - val_loss: 0.0234 - val_acc: 0.9944
Epoch 156/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0268 - acc: 0.9925 - val_loss: 0.0234 - val_acc: 0.9944
Epoch 157/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0265 - acc: 0.9928 - val_loss: 0.0233 - val_acc: 0.9944
Epoch 158/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0265 - acc: 0.9929 - val_loss: 0.0233 - val_acc: 0.9944
Epoch 159/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0251 - acc: 0.9929 - val_loss: 0.0233 - val_acc: 0.9944
Epoch 160/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0262 - acc: 0.9922 - val_loss: 0.0232 - val_acc: 0.9944
Epoch 161/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0260 - acc: 0.9927 - val_loss: 0.0231 - val_acc: 0.9944
Epoch 162/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0256 - acc: 0.9935 - val_loss: 0.0231 - val_acc: 0.9944
Epoch 163/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0263 - acc: 0.9925 - val_loss: 0.0231 - val_acc: 0.9944
Epoch 164/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0253 - acc: 0.9929 - val_loss: 0.0230 - val_acc: 0.9944
Epoch 165/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0262 - acc: 0.9930 - val_loss: 0.0229 - val_acc: 0.9944
Epoch 166/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0263 - acc: 0.9928 - val_loss: 0.0229 - val_acc: 0.9944
Epoch 167/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0267 - acc: 0.9924 - val_loss: 0.0229 - val_acc: 0.9944
Epoch 168/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0253 - acc: 0.9930 - val_loss: 0.0228 - val_acc: 0.9944
Epoch 169/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0261 - acc: 0.9926 - val_loss: 0.0228 - val_acc: 0.9944
Epoch 170/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0255 - acc: 0.9930 - val_loss: 0.0227 - val_acc: 0.9944
Epoch 171/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0250 - acc: 0.9929 - val_loss: 0.0227 - val_acc: 0.9944
Epoch 172/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0261 - acc: 0.9923 - val_loss: 0.0227 - val_acc: 0.9944
Epoch 173/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0259 - acc: 0.9924 - val_loss: 0.0226 - val_acc: 0.9944
Epoch 174/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0254 - acc: 0.9929 - val_loss: 0.0226 - val_acc: 0.9946
Epoch 175/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0255 - acc: 0.9925 - val_loss: 0.0225 - val_acc: 0.9946
Epoch 176/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0259 - acc: 0.9924 - val_loss: 0.0225 - val_acc: 0.9946
Epoch 177/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0251 - acc: 0.9928 - val_loss: 0.0224 - val_acc: 0.9946
Epoch 178/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0247 - acc: 0.9931 - val_loss: 0.0224 - val_acc: 0.9946
Epoch 179/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0250 - acc: 0.9928 - val_loss: 0.0223 - val_acc: 0.9946
Epoch 180/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0242 - acc: 0.9931 - val_loss: 0.0223 - val_acc: 0.9946
Epoch 181/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0252 - acc: 0.9928 - val_loss: 0.0222 - val_acc: 0.9946
Epoch 182/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0249 - acc: 0.9926 - val_loss: 0.0222 - val_acc: 0.9946
Epoch 183/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0250 - acc: 0.9927 - val_loss: 0.0222 - val_acc: 0.9946
Epoch 184/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0250 - acc: 0.9928 - val_loss: 0.0221 - val_acc: 0.9946
Epoch 185/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0251 - acc: 0.9929 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 186/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9931 - val_loss: 0.0220 - val_acc: 0.9948
Epoch 187/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0248 - acc: 0.9926 - val_loss: 0.0220 - val_acc: 0.9948
Epoch 188/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0248 - acc: 0.9928 - val_loss: 0.0220 - val_acc: 0.9948
Epoch 189/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0243 - acc: 0.9933 - val_loss: 0.0219 - val_acc: 0.9948
Epoch 190/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9932 - val_loss: 0.0219 - val_acc: 0.9948
Epoch 191/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0245 - acc: 0.9931 - val_loss: 0.0219 - val_acc: 0.9948
Epoch 192/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0245 - acc: 0.9923 - val_loss: 0.0219 - val_acc: 0.9948
Epoch 193/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0235 - acc: 0.9934 - val_loss: 0.0218 - val_acc: 0.9948
Epoch 194/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0238 - acc: 0.9936 - val_loss: 0.0218 - val_acc: 0.9948
Epoch 195/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0246 - acc: 0.9928 - val_loss: 0.0218 - val_acc: 0.9948
Epoch 196/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0245 - acc: 0.9931 - val_loss: 0.0217 - val_acc: 0.9948
Epoch 197/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0246 - acc: 0.9929 - val_loss: 0.0217 - val_acc: 0.9948
Epoch 198/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0241 - acc: 0.9929 - val_loss: 0.0216 - val_acc: 0.9948
Epoch 199/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9933 - val_loss: 0.0216 - val_acc: 0.9948
Epoch 200/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0246 - acc: 0.9928 - val_loss: 0.0216 - val_acc: 0.9948
Epoch 201/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0237 - acc: 0.9929 - val_loss: 0.0215 - val_acc: 0.9948
Epoch 202/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9928 - val_loss: 0.0215 - val_acc: 0.9948
Epoch 203/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9929 - val_loss: 0.0215 - val_acc: 0.9948
Epoch 204/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9932 - val_loss: 0.0214 - val_acc: 0.9948
Epoch 205/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0229 - acc: 0.9941 - val_loss: 0.0214 - val_acc: 0.9948
Epoch 206/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9938 - val_loss: 0.0214 - val_acc: 0.9948
Epoch 207/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9929 - val_loss: 0.0213 - val_acc: 0.9948
Epoch 208/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0235 - acc: 0.9932 - val_loss: 0.0213 - val_acc: 0.9948
Epoch 209/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0231 - acc: 0.9936 - val_loss: 0.0213 - val_acc: 0.9948
Epoch 210/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0241 - acc: 0.9929 - val_loss: 0.0212 - val_acc: 0.9948
Epoch 211/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0236 - acc: 0.9931 - val_loss: 0.0212 - val_acc: 0.9948
Epoch 212/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0232 - acc: 0.9938 - val_loss: 0.0212 - val_acc: 0.9948
Epoch 213/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0232 - acc: 0.9932 - val_loss: 0.0211 - val_acc: 0.9948
Epoch 214/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0227 - acc: 0.9934 - val_loss: 0.0211 - val_acc: 0.9948
Epoch 215/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9932 - val_loss: 0.0211 - val_acc: 0.9948
Epoch 216/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0240 - acc: 0.9931 - val_loss: 0.0211 - val_acc: 0.9948
Epoch 217/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0237 - acc: 0.9931 - val_loss: 0.0211 - val_acc: 0.9948
Epoch 218/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0236 - acc: 0.9931 - val_loss: 0.0210 - val_acc: 0.9948
Epoch 219/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0229 - acc: 0.9930 - val_loss: 0.0210 - val_acc: 0.9948
Epoch 220/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0236 - acc: 0.9937 - val_loss: 0.0209 - val_acc: 0.9948
Epoch 221/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0236 - acc: 0.9935 - val_loss: 0.0209 - val_acc: 0.9948
Epoch 222/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0241 - acc: 0.9931 - val_loss: 0.0209 - val_acc: 0.9948
Epoch 223/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0241 - acc: 0.9929 - val_loss: 0.0209 - val_acc: 0.9948
Epoch 224/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0228 - acc: 0.9937 - val_loss: 0.0208 - val_acc: 0.9948
Epoch 225/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9935 - val_loss: 0.0208 - val_acc: 0.9948
Epoch 226/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0226 - acc: 0.9933 - val_loss: 0.0208 - val_acc: 0.9948
Epoch 227/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0236 - acc: 0.9932 - val_loss: 0.0208 - val_acc: 0.9948
Epoch 228/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0233 - acc: 0.9925 - val_loss: 0.0207 - val_acc: 0.9948
Epoch 229/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0231 - acc: 0.9936 - val_loss: 0.0207 - val_acc: 0.9948
Epoch 230/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9930 - val_loss: 0.0207 - val_acc: 0.9948
Epoch 231/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0227 - acc: 0.9937 - val_loss: 0.0207 - val_acc: 0.9946
Epoch 232/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0224 - acc: 0.9931 - val_loss: 0.0206 - val_acc: 0.9948
Epoch 233/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0227 - acc: 0.9933 - val_loss: 0.0206 - val_acc: 0.9946
Epoch 234/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0232 - acc: 0.9929 - val_loss: 0.0206 - val_acc: 0.9946
Epoch 235/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0232 - acc: 0.9928 - val_loss: 0.0206 - val_acc: 0.9946
Epoch 236/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0223 - acc: 0.9937 - val_loss: 0.0206 - val_acc: 0.9946
Epoch 237/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0221 - acc: 0.9941 - val_loss: 0.0205 - val_acc: 0.9946
Epoch 238/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0217 - acc: 0.9937 - val_loss: 0.0205 - val_acc: 0.9948
Epoch 239/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0232 - acc: 0.9930 - val_loss: 0.0205 - val_acc: 0.9948
Epoch 240/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0231 - acc: 0.9936 - val_loss: 0.0204 - val_acc: 0.9948
Epoch 241/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0226 - acc: 0.9936 - val_loss: 0.0204 - val_acc: 0.9948
Epoch 242/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0218 - acc: 0.9935 - val_loss: 0.0204 - val_acc: 0.9948
Epoch 243/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0221 - acc: 0.9941 - val_loss: 0.0204 - val_acc: 0.9948
Epoch 244/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0228 - acc: 0.9933 - val_loss: 0.0203 - val_acc: 0.9948
Epoch 245/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0212 - acc: 0.9944 - val_loss: 0.0203 - val_acc: 0.9948
Epoch 246/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0229 - acc: 0.9933 - val_loss: 0.0203 - val_acc: 0.9948
Epoch 247/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0228 - acc: 0.9931 - val_loss: 0.0203 - val_acc: 0.9948
Epoch 248/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0225 - acc: 0.9935 - val_loss: 0.0203 - val_acc: 0.9946
Epoch 249/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0224 - acc: 0.9936 - val_loss: 0.0202 - val_acc: 0.9946
Epoch 250/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0221 - acc: 0.9933 - val_loss: 0.0202 - val_acc: 0.9946
Epoch 251/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0227 - acc: 0.9930 - val_loss: 0.0202 - val_acc: 0.9946
Epoch 252/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0224 - acc: 0.9937 - val_loss: 0.0202 - val_acc: 0.9946
Epoch 253/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0225 - acc: 0.9937 - val_loss: 0.0201 - val_acc: 0.9946
Epoch 254/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0222 - acc: 0.9936 - val_loss: 0.0201 - val_acc: 0.9946
Epoch 255/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0224 - acc: 0.9942 - val_loss: 0.0201 - val_acc: 0.9946
Epoch 256/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0222 - acc: 0.9933 - val_loss: 0.0201 - val_acc: 0.9946
Epoch 257/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0222 - acc: 0.9933 - val_loss: 0.0201 - val_acc: 0.9946
... (epochs 258–497 elided: training loss drifts down gradually from ~0.022 to ~0.018, val_loss from 0.0200 to 0.0173, and val_acc plateaus in the 0.9946–0.9950 range) ...
Epoch 498/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0173 - acc: 0.9950 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 499/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0185 - acc: 0.9940 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 500/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0184 - acc: 0.9944 - val_loss: 0.0173 - val_acc: 0.9946
The model is not good enough

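The run below fits a classifier head on merged bottleneck features from InceptionV3, Xception, ResNet50 and InceptionResNetV2; the printed per-sample feature shapes are [(2048,), (2048,), (2048,), (1536,)]. The actual model construction lives in the imported helper code and is not shown here, so the following is only a minimal sketch, under the assumption that "merged" means concatenating the four feature matrices along the feature axis before training a small dense sigmoid head on the result (a small illustrative sample count is used in place of the real 20,000 rows):

```python
import numpy as np

# Hypothetical stand-ins for the pre-extracted bottleneck features.
# Widths match the shapes printed in the log; 100 samples is illustrative
# (the actual run uses 20,000 training samples).
feature_widths = [2048, 2048, 2048, 1536]  # InceptionV3, Xception, ResNet50, InceptionResNetV2
features = [np.zeros((100, w), dtype=np.float32) for w in feature_widths]

# Merge step: concatenate along the feature axis, yielding one
# (n_samples, 7680) matrix that a Dropout + Dense(1, 'sigmoid') head
# could then be fit on.
merged = np.concatenate(features, axis=1)
print(merged.shape)  # (100, 7680)
```

Concatenation (rather than averaging) preserves each backbone's full feature vector, which is why the merged width is the sum 2048 + 2048 + 2048 + 1536 = 7680.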
Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2 
with lr:0.0001 ,mom:0.5 , decay:0.0001
[(2048,), (2048,), (2048,), (1536,)]
model created
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 62us/step - loss: 0.5157 - acc: 0.7560 - val_loss: 0.3201 - val_acc: 0.9458
Epoch 2/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.2900 - acc: 0.9212 - val_loss: 0.2079 - val_acc: 0.9728
Epoch 3/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.2078 - acc: 0.9559 - val_loss: 0.1589 - val_acc: 0.9790
Epoch 4/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.1658 - acc: 0.9688 - val_loss: 0.1314 - val_acc: 0.9812
Epoch 5/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.1423 - acc: 0.9732 - val_loss: 0.1137 - val_acc: 0.9838
Epoch 6/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.1234 - acc: 0.9756 - val_loss: 0.1014 - val_acc: 0.9842
Epoch 7/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.1124 - acc: 0.9776 - val_loss: 0.0921 - val_acc: 0.9844
Epoch 8/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.1023 - acc: 0.9804 - val_loss: 0.0850 - val_acc: 0.9852
Epoch 9/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0961 - acc: 0.9813 - val_loss: 0.0793 - val_acc: 0.9856
Epoch 10/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0896 - acc: 0.9825 - val_loss: 0.0746 - val_acc: 0.9858
... (epochs 11-239 omitted: loss falls steadily from 0.0843 to 0.0264, val_loss from 0.0707 to 0.0231, val_acc climbs from 0.9864 to 0.9948) ...
Epoch 240/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0255 - acc: 0.9931 - val_loss: 0.0231 - val_acc: 0.9948
Epoch 241/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0262 - acc: 0.9923 - val_loss: 0.0230 - val_acc: 0.9948
Epoch 242/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0255 - acc: 0.9925 - val_loss: 0.0230 - val_acc: 0.9948
Epoch 243/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0258 - acc: 0.9923 - val_loss: 0.0230 - val_acc: 0.9948
Epoch 244/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0268 - acc: 0.9923 - val_loss: 0.0230 - val_acc: 0.9948
Epoch 245/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0247 - acc: 0.9936 - val_loss: 0.0230 - val_acc: 0.9948
Epoch 246/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0260 - acc: 0.9929 - val_loss: 0.0230 - val_acc: 0.9948
Epoch 247/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0259 - acc: 0.9926 - val_loss: 0.0229 - val_acc: 0.9948
Epoch 248/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0261 - acc: 0.9924 - val_loss: 0.0229 - val_acc: 0.9948
Epoch 249/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0259 - acc: 0.9930 - val_loss: 0.0229 - val_acc: 0.9948
Epoch 250/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0262 - acc: 0.9923 - val_loss: 0.0229 - val_acc: 0.9948
Epoch 251/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0253 - acc: 0.9932 - val_loss: 0.0229 - val_acc: 0.9948
Epoch 252/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0266 - acc: 0.9927 - val_loss: 0.0229 - val_acc: 0.9948
Epoch 253/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0264 - acc: 0.9931 - val_loss: 0.0228 - val_acc: 0.9948
Epoch 254/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0247 - acc: 0.9932 - val_loss: 0.0228 - val_acc: 0.9948
Epoch 255/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0254 - acc: 0.9929 - val_loss: 0.0228 - val_acc: 0.9948
Epoch 256/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0254 - acc: 0.9931 - val_loss: 0.0228 - val_acc: 0.9948
Epoch 257/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0251 - acc: 0.9932 - val_loss: 0.0228 - val_acc: 0.9948
Epoch 258/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0253 - acc: 0.9932 - val_loss: 0.0228 - val_acc: 0.9948
Epoch 259/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0253 - acc: 0.9933 - val_loss: 0.0227 - val_acc: 0.9948
Epoch 260/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0260 - acc: 0.9922 - val_loss: 0.0227 - val_acc: 0.9948
Epoch 261/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0248 - acc: 0.9935 - val_loss: 0.0227 - val_acc: 0.9948
Epoch 262/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0255 - acc: 0.9930 - val_loss: 0.0227 - val_acc: 0.9948
Epoch 263/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0258 - acc: 0.9934 - val_loss: 0.0227 - val_acc: 0.9948
Epoch 264/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0259 - acc: 0.9927 - val_loss: 0.0227 - val_acc: 0.9948
Epoch 265/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0254 - acc: 0.9927 - val_loss: 0.0227 - val_acc: 0.9948
Epoch 266/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0261 - acc: 0.9927 - val_loss: 0.0227 - val_acc: 0.9948
Epoch 267/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0246 - acc: 0.9932 - val_loss: 0.0226 - val_acc: 0.9950
Epoch 268/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0256 - acc: 0.9932 - val_loss: 0.0226 - val_acc: 0.9950
Epoch 269/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0251 - acc: 0.9934 - val_loss: 0.0226 - val_acc: 0.9950
Epoch 270/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0254 - acc: 0.9932 - val_loss: 0.0226 - val_acc: 0.9950
Epoch 271/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0254 - acc: 0.9926 - val_loss: 0.0226 - val_acc: 0.9950
Epoch 272/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0250 - acc: 0.9929 - val_loss: 0.0226 - val_acc: 0.9950
Epoch 273/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0257 - acc: 0.9925 - val_loss: 0.0226 - val_acc: 0.9950
Epoch 274/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0259 - acc: 0.9927 - val_loss: 0.0225 - val_acc: 0.9950
Epoch 275/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0257 - acc: 0.9928 - val_loss: 0.0225 - val_acc: 0.9950
Epoch 276/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0255 - acc: 0.9928 - val_loss: 0.0225 - val_acc: 0.9950
Epoch 277/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0258 - acc: 0.9931 - val_loss: 0.0225 - val_acc: 0.9950
Epoch 278/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0257 - acc: 0.9929 - val_loss: 0.0225 - val_acc: 0.9950
Epoch 279/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0257 - acc: 0.9927 - val_loss: 0.0225 - val_acc: 0.9950
Epoch 280/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0251 - acc: 0.9932 - val_loss: 0.0225 - val_acc: 0.9950
Epoch 281/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0258 - acc: 0.9929 - val_loss: 0.0224 - val_acc: 0.9950
Epoch 282/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0257 - acc: 0.9920 - val_loss: 0.0224 - val_acc: 0.9950
Epoch 283/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0253 - acc: 0.9928 - val_loss: 0.0224 - val_acc: 0.9950
Epoch 284/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0251 - acc: 0.9927 - val_loss: 0.0224 - val_acc: 0.9950
Epoch 285/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0253 - acc: 0.9932 - val_loss: 0.0224 - val_acc: 0.9950
Epoch 286/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0258 - acc: 0.9926 - val_loss: 0.0224 - val_acc: 0.9950
Epoch 287/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0246 - acc: 0.9929 - val_loss: 0.0224 - val_acc: 0.9950
Epoch 288/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0248 - acc: 0.9931 - val_loss: 0.0224 - val_acc: 0.9950
Epoch 289/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0242 - acc: 0.9929 - val_loss: 0.0223 - val_acc: 0.9950
Epoch 290/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0246 - acc: 0.9929 - val_loss: 0.0223 - val_acc: 0.9950
Epoch 291/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0252 - acc: 0.9931 - val_loss: 0.0223 - val_acc: 0.9950
Epoch 292/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0246 - acc: 0.9931 - val_loss: 0.0223 - val_acc: 0.9950
Epoch 293/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0248 - acc: 0.9931 - val_loss: 0.0223 - val_acc: 0.9950
Epoch 294/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0247 - acc: 0.9929 - val_loss: 0.0223 - val_acc: 0.9950
Epoch 295/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0251 - acc: 0.9931 - val_loss: 0.0223 - val_acc: 0.9950
Epoch 296/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0248 - acc: 0.9927 - val_loss: 0.0223 - val_acc: 0.9950
Epoch 297/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0248 - acc: 0.9927 - val_loss: 0.0223 - val_acc: 0.9950
Epoch 298/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0254 - acc: 0.9927 - val_loss: 0.0222 - val_acc: 0.9950
Epoch 299/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0243 - acc: 0.9937 - val_loss: 0.0222 - val_acc: 0.9950
Epoch 300/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0254 - acc: 0.9927 - val_loss: 0.0222 - val_acc: 0.9950
Epoch 301/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0249 - acc: 0.9929 - val_loss: 0.0222 - val_acc: 0.9950
Epoch 302/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0243 - acc: 0.9927 - val_loss: 0.0222 - val_acc: 0.9950
Epoch 303/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0248 - acc: 0.9937 - val_loss: 0.0222 - val_acc: 0.9948
Epoch 304/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0251 - acc: 0.9928 - val_loss: 0.0222 - val_acc: 0.9948
Epoch 305/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9933 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 306/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0250 - acc: 0.9933 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 307/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0246 - acc: 0.9931 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 308/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0249 - acc: 0.9927 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 309/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0236 - acc: 0.9937 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 310/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0245 - acc: 0.9932 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 311/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0248 - acc: 0.9929 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 312/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0238 - acc: 0.9932 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 313/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0252 - acc: 0.9931 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 314/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0252 - acc: 0.9923 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 315/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0245 - acc: 0.9931 - val_loss: 0.0220 - val_acc: 0.9948
Epoch 316/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0239 - acc: 0.9937 - val_loss: 0.0220 - val_acc: 0.9948
Epoch 317/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0250 - acc: 0.9933 - val_loss: 0.0220 - val_acc: 0.9948
Epoch 318/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0240 - acc: 0.9931 - val_loss: 0.0220 - val_acc: 0.9948
Epoch 319/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0243 - acc: 0.9933 - val_loss: 0.0220 - val_acc: 0.9948
Epoch 320/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0246 - acc: 0.9932 - val_loss: 0.0220 - val_acc: 0.9948
Epoch 321/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0246 - acc: 0.9928 - val_loss: 0.0220 - val_acc: 0.9948
Epoch 322/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0262 - acc: 0.9922 - val_loss: 0.0220 - val_acc: 0.9948
Epoch 323/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0242 - acc: 0.9935 - val_loss: 0.0220 - val_acc: 0.9948
Epoch 324/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0248 - acc: 0.9927 - val_loss: 0.0219 - val_acc: 0.9948
Epoch 325/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0249 - acc: 0.9929 - val_loss: 0.0219 - val_acc: 0.9948
Epoch 326/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0242 - acc: 0.9937 - val_loss: 0.0219 - val_acc: 0.9948
Epoch 327/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0241 - acc: 0.9937 - val_loss: 0.0219 - val_acc: 0.9948
Epoch 328/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0239 - acc: 0.9935 - val_loss: 0.0219 - val_acc: 0.9948
Epoch 329/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0252 - acc: 0.9924 - val_loss: 0.0219 - val_acc: 0.9948
Epoch 330/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0238 - acc: 0.9935 - val_loss: 0.0219 - val_acc: 0.9948
Epoch 331/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0245 - acc: 0.9931 - val_loss: 0.0219 - val_acc: 0.9948
Epoch 332/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0243 - acc: 0.9931 - val_loss: 0.0219 - val_acc: 0.9948
Epoch 333/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0251 - acc: 0.9931 - val_loss: 0.0219 - val_acc: 0.9948
Epoch 334/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0242 - acc: 0.9928 - val_loss: 0.0218 - val_acc: 0.9948
Epoch 335/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0242 - acc: 0.9932 - val_loss: 0.0218 - val_acc: 0.9948
Epoch 336/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0249 - acc: 0.9928 - val_loss: 0.0218 - val_acc: 0.9948
Epoch 337/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0246 - acc: 0.9932 - val_loss: 0.0218 - val_acc: 0.9948
Epoch 338/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0248 - acc: 0.9932 - val_loss: 0.0218 - val_acc: 0.9948
Epoch 339/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0246 - acc: 0.9932 - val_loss: 0.0218 - val_acc: 0.9948
Epoch 340/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0244 - acc: 0.9926 - val_loss: 0.0218 - val_acc: 0.9948
Epoch 341/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9933 - val_loss: 0.0218 - val_acc: 0.9948
Epoch 342/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0244 - acc: 0.9933 - val_loss: 0.0218 - val_acc: 0.9948
Epoch 343/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0250 - acc: 0.9926 - val_loss: 0.0218 - val_acc: 0.9948
Epoch 344/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0242 - acc: 0.9932 - val_loss: 0.0217 - val_acc: 0.9948
Epoch 345/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9936 - val_loss: 0.0217 - val_acc: 0.9948
Epoch 346/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0249 - acc: 0.9932 - val_loss: 0.0217 - val_acc: 0.9948
Epoch 347/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9935 - val_loss: 0.0217 - val_acc: 0.9948
Epoch 348/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0245 - acc: 0.9935 - val_loss: 0.0217 - val_acc: 0.9948
Epoch 349/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9936 - val_loss: 0.0217 - val_acc: 0.9948
Epoch 350/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0241 - acc: 0.9932 - val_loss: 0.0217 - val_acc: 0.9948
Epoch 351/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0244 - acc: 0.9929 - val_loss: 0.0217 - val_acc: 0.9948
Epoch 352/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0248 - acc: 0.9931 - val_loss: 0.0217 - val_acc: 0.9948
Epoch 353/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0246 - acc: 0.9937 - val_loss: 0.0217 - val_acc: 0.9948
Epoch 354/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0242 - acc: 0.9933 - val_loss: 0.0217 - val_acc: 0.9948
Epoch 355/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0239 - acc: 0.9932 - val_loss: 0.0216 - val_acc: 0.9948
Epoch 356/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0242 - acc: 0.9928 - val_loss: 0.0216 - val_acc: 0.9948
Epoch 357/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0233 - acc: 0.9938 - val_loss: 0.0216 - val_acc: 0.9948
Epoch 358/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0244 - acc: 0.9932 - val_loss: 0.0216 - val_acc: 0.9948
Epoch 359/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0239 - acc: 0.9929 - val_loss: 0.0216 - val_acc: 0.9948
Epoch 360/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0248 - acc: 0.9928 - val_loss: 0.0216 - val_acc: 0.9948
Epoch 361/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0246 - acc: 0.9931 - val_loss: 0.0216 - val_acc: 0.9948
Epoch 362/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0244 - acc: 0.9931 - val_loss: 0.0216 - val_acc: 0.9950
Epoch 363/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0242 - acc: 0.9936 - val_loss: 0.0216 - val_acc: 0.9950
Epoch 364/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9934 - val_loss: 0.0216 - val_acc: 0.9950
Epoch 365/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0239 - acc: 0.9935 - val_loss: 0.0216 - val_acc: 0.9950
Epoch 366/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0243 - acc: 0.9928 - val_loss: 0.0216 - val_acc: 0.9950
Epoch 367/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9937 - val_loss: 0.0215 - val_acc: 0.9950
Epoch 368/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0233 - acc: 0.9937 - val_loss: 0.0215 - val_acc: 0.9950
Epoch 369/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0243 - acc: 0.9933 - val_loss: 0.0215 - val_acc: 0.9950
Epoch 370/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0244 - acc: 0.9930 - val_loss: 0.0215 - val_acc: 0.9950
Epoch 371/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0241 - acc: 0.9935 - val_loss: 0.0215 - val_acc: 0.9950
Epoch 372/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0232 - acc: 0.9935 - val_loss: 0.0215 - val_acc: 0.9950
Epoch 373/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0242 - acc: 0.9931 - val_loss: 0.0215 - val_acc: 0.9950
Epoch 374/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0245 - acc: 0.9932 - val_loss: 0.0215 - val_acc: 0.9950
Epoch 375/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0241 - acc: 0.9932 - val_loss: 0.0215 - val_acc: 0.9950
Epoch 376/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0239 - acc: 0.9931 - val_loss: 0.0215 - val_acc: 0.9950
Epoch 377/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0243 - acc: 0.9931 - val_loss: 0.0215 - val_acc: 0.9950
Epoch 378/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0241 - acc: 0.9931 - val_loss: 0.0215 - val_acc: 0.9950
Epoch 379/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0244 - acc: 0.9929 - val_loss: 0.0214 - val_acc: 0.9950
Epoch 380/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0242 - acc: 0.9929 - val_loss: 0.0214 - val_acc: 0.9950
Epoch 381/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0236 - acc: 0.9936 - val_loss: 0.0214 - val_acc: 0.9950
Epoch 382/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0244 - acc: 0.9932 - val_loss: 0.0214 - val_acc: 0.9950
Epoch 383/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0238 - acc: 0.9932 - val_loss: 0.0214 - val_acc: 0.9950
Epoch 384/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9936 - val_loss: 0.0214 - val_acc: 0.9950
Epoch 385/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0238 - acc: 0.9927 - val_loss: 0.0214 - val_acc: 0.9950
Epoch 386/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0253 - acc: 0.9931 - val_loss: 0.0214 - val_acc: 0.9950
Epoch 387/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0238 - acc: 0.9934 - val_loss: 0.0214 - val_acc: 0.9950
Epoch 388/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9933 - val_loss: 0.0214 - val_acc: 0.9950
Epoch 389/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9935 - val_loss: 0.0214 - val_acc: 0.9950
Epoch 390/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9926 - val_loss: 0.0214 - val_acc: 0.9950
Epoch 391/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9933 - val_loss: 0.0214 - val_acc: 0.9950
Epoch 392/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0237 - acc: 0.9929 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 393/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0248 - acc: 0.9927 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 394/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9935 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 395/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9935 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 396/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9936 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 397/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0234 - acc: 0.9936 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 398/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0231 - acc: 0.9936 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 399/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9943 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 400/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0235 - acc: 0.9935 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 401/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0231 - acc: 0.9935 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 402/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0235 - acc: 0.9938 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 403/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0239 - acc: 0.9935 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 404/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0239 - acc: 0.9937 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 405/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0241 - acc: 0.9932 - val_loss: 0.0213 - val_acc: 0.9950
Epoch 406/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0241 - acc: 0.9931 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 407/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0246 - acc: 0.9932 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 408/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9936 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 409/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0233 - acc: 0.9931 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 410/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9932 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 411/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0238 - acc: 0.9933 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 412/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9932 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 413/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0233 - acc: 0.9929 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 414/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0242 - acc: 0.9931 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 415/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0230 - acc: 0.9931 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 416/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9933 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 417/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9935 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 418/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9936 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 419/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0226 - acc: 0.9934 - val_loss: 0.0212 - val_acc: 0.9950
Epoch 420/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0233 - acc: 0.9932 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 421/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9929 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 422/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0241 - acc: 0.9935 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 423/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0240 - acc: 0.9931 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 424/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0244 - acc: 0.9932 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 425/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0236 - acc: 0.9931 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 426/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0234 - acc: 0.9936 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 427/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0231 - acc: 0.9931 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 428/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0238 - acc: 0.9928 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 429/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0234 - acc: 0.9932 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 430/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0227 - acc: 0.9940 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 431/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0235 - acc: 0.9930 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 432/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0236 - acc: 0.9931 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 433/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0230 - acc: 0.9931 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 434/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0240 - acc: 0.9934 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 435/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0234 - acc: 0.9934 - val_loss: 0.0211 - val_acc: 0.9950
Epoch 436/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9931 - val_loss: 0.0210 - val_acc: 0.9950
Epoch 437/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0236 - acc: 0.9935 - val_loss: 0.0210 - val_acc: 0.9950
Epoch 438/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0231 - acc: 0.9936 - val_loss: 0.0210 - val_acc: 0.9950
Epoch 439/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0228 - acc: 0.9940 - val_loss: 0.0210 - val_acc: 0.9950
Epoch 440/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0230 - acc: 0.9934 - val_loss: 0.0210 - val_acc: 0.9950
Epoch 441/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0233 - acc: 0.9928 - val_loss: 0.0210 - val_acc: 0.9950
...
(Epochs 442-499 omitted: loss hovered around 0.023, val_loss crept from 0.0210 to 0.0207, val_acc held at 0.9950 throughout)
Epoch 500/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0235 - acc: 0.9931 - val_loss: 0.0207 - val_acc: 0.9950
The model is not good enough
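Running the full 500 epochs while val_loss sits pinned near 0.021 is wasted compute. Keras ships an `EarlyStopping` callback for exactly this case, and the plateau rule it applies can be sketched in plain Python (a hypothetical helper for illustration, not part of the notebook):

```python
def plateaued(val_losses, patience=20, min_delta=1e-4):
    """True if val_loss has not improved by at least min_delta
    in the last `patience` epochs -- the point to stop training."""
    if len(val_losses) <= patience:
        return False
    best_before = min(val_losses[:-patience])   # best loss seen earlier
    best_recent = min(val_losses[-patience:])   # best loss in the window
    return best_recent > best_before - min_delta

# The run above: val_loss stuck at ~0.021 for dozens of epochs.
print(plateaued([0.10, 0.05, 0.03] + [0.0210] * 60))  # -> True

# Equivalent notebook-era Keras usage (sketch):
# from keras.callbacks import EarlyStopping
# model.fit(..., callbacks=[EarlyStopping(monitor='val_loss',
#                                         min_delta=1e-4, patience=20)])
```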

Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2
with lr: 0.0001, mom: 0.5, decay: 1e-05
[(2048,), (2048,), (2048,), (1536,)]
model created
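The shapes printed above are the per-image bottleneck features of the four base networks (InceptionV3, Xception, ResNet50: 2048-d each; InceptionResNetV2: 1536-d). A minimal numpy sketch of the merge step, with hypothetical array names — the notebook's actual helper wraps this concatenation in a Keras model:

```python
import numpy as np

n_images = 4                                       # toy batch for illustration
features = [np.random.rand(n_images, d)
            for d in (2048, 2048, 2048, 1536)]     # stand-ins for saved features
merged = np.concatenate(features, axis=1)          # one 7680-d vector per image
assert merged.shape == (n_images, 2048 + 2048 + 2048 + 1536)

# The classifier head, sketched from the log above (notebook-era Keras API):
# x = Input(shape=(7680,)); x = Dropout(0.5)(x)
# out = Dense(1, activation='sigmoid')(x)          # binary: dog vs. cat
# model.compile(optimizer=SGD(lr=1e-4, momentum=0.5, decay=1e-5),
#               loss='binary_crossentropy', metrics=['accuracy'])
```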
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 63us/step - loss: 0.4919 - acc: 0.7745 - val_loss: 0.3030 - val_acc: 0.9570
Epoch 2/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.2814 - acc: 0.9286 - val_loss: 0.1970 - val_acc: 0.9758
Epoch 3/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.2011 - acc: 0.9573 - val_loss: 0.1502 - val_acc: 0.9812
Epoch 4/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.1625 - acc: 0.9674 - val_loss: 0.1237 - val_acc: 0.9844
Epoch 5/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.1371 - acc: 0.9733 - val_loss: 0.1068 - val_acc: 0.9850
Epoch 6/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.1206 - acc: 0.9763 - val_loss: 0.0948 - val_acc: 0.9868
Epoch 7/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.1082 - acc: 0.9786 - val_loss: 0.0861 - val_acc: 0.9876
Epoch 8/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0993 - acc: 0.9795 - val_loss: 0.0793 - val_acc: 0.9878
Epoch 9/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0907 - acc: 0.9820 - val_loss: 0.0738 - val_acc: 0.9884
Epoch 10/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0865 - acc: 0.9823 - val_loss: 0.0693 - val_acc: 0.9884
...
(Epochs 11-195 omitted: loss fell steadily from 0.0795 to 0.0213, val_loss from 0.0656 to 0.0202, val_acc from 0.9888 to 0.9942)
Epoch 196/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0217 - acc: 0.9937 - val_loss: 0.0202 - val_acc: 0.9942
Epoch 197/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0210 - acc: 0.9939 - val_loss: 0.0202 - val_acc: 0.9942
Epoch 198/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0210 - acc: 0.9938 - val_loss: 0.0201 - val_acc: 0.9942
Epoch 199/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0211 - acc: 0.9939 - val_loss: 0.0201 - val_acc: 0.9942
Epoch 200/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0216 - acc: 0.9936 - val_loss: 0.0201 - val_acc: 0.9942
Epoch 201/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0221 - acc: 0.9932 - val_loss: 0.0200 - val_acc: 0.9942
Epoch 202/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0212 - acc: 0.9940 - val_loss: 0.0200 - val_acc: 0.9942
Epoch 203/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0197 - acc: 0.9941 - val_loss: 0.0200 - val_acc: 0.9942
Epoch 204/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0212 - acc: 0.9940 - val_loss: 0.0200 - val_acc: 0.9942
Epoch 205/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0209 - acc: 0.9938 - val_loss: 0.0200 - val_acc: 0.9942
Epoch 206/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0208 - acc: 0.9938 - val_loss: 0.0200 - val_acc: 0.9942
Epoch 207/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0215 - acc: 0.9941 - val_loss: 0.0200 - val_acc: 0.9942
Epoch 208/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0215 - acc: 0.9935 - val_loss: 0.0199 - val_acc: 0.9942
Epoch 209/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0204 - acc: 0.9940 - val_loss: 0.0199 - val_acc: 0.9942
Epoch 210/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0206 - acc: 0.9941 - val_loss: 0.0199 - val_acc: 0.9942
Epoch 211/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0212 - acc: 0.9941 - val_loss: 0.0199 - val_acc: 0.9942
Epoch 212/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0206 - acc: 0.9940 - val_loss: 0.0199 - val_acc: 0.9942
Epoch 213/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0208 - acc: 0.9942 - val_loss: 0.0198 - val_acc: 0.9942
Epoch 214/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0213 - acc: 0.9935 - val_loss: 0.0198 - val_acc: 0.9942
Epoch 215/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0213 - acc: 0.9934 - val_loss: 0.0198 - val_acc: 0.9942
Epoch 216/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0210 - acc: 0.9938 - val_loss: 0.0198 - val_acc: 0.9942
Epoch 217/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0210 - acc: 0.9939 - val_loss: 0.0198 - val_acc: 0.9942
Epoch 218/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0208 - acc: 0.9937 - val_loss: 0.0197 - val_acc: 0.9942
Epoch 219/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0203 - acc: 0.9938 - val_loss: 0.0197 - val_acc: 0.9942
Epoch 220/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0211 - acc: 0.9936 - val_loss: 0.0197 - val_acc: 0.9942
Epoch 221/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0207 - acc: 0.9932 - val_loss: 0.0197 - val_acc: 0.9942
Epoch 222/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0199 - acc: 0.9940 - val_loss: 0.0196 - val_acc: 0.9942
Epoch 223/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0203 - acc: 0.9940 - val_loss: 0.0196 - val_acc: 0.9942
Epoch 224/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0209 - acc: 0.9940 - val_loss: 0.0196 - val_acc: 0.9942
Epoch 225/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0202 - acc: 0.9940 - val_loss: 0.0196 - val_acc: 0.9942
Epoch 226/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0200 - acc: 0.9950 - val_loss: 0.0196 - val_acc: 0.9942
Epoch 227/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0201 - acc: 0.9947 - val_loss: 0.0196 - val_acc: 0.9942
Epoch 228/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0207 - acc: 0.9937 - val_loss: 0.0196 - val_acc: 0.9942
Epoch 229/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0204 - acc: 0.9940 - val_loss: 0.0195 - val_acc: 0.9942
Epoch 230/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0205 - acc: 0.9941 - val_loss: 0.0195 - val_acc: 0.9942
Epoch 231/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0196 - acc: 0.9941 - val_loss: 0.0195 - val_acc: 0.9942
Epoch 232/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0206 - acc: 0.9941 - val_loss: 0.0195 - val_acc: 0.9942
Epoch 233/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0194 - acc: 0.9941 - val_loss: 0.0195 - val_acc: 0.9942
Epoch 234/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0201 - acc: 0.9940 - val_loss: 0.0195 - val_acc: 0.9942
Epoch 235/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0209 - acc: 0.9940 - val_loss: 0.0195 - val_acc: 0.9942
Epoch 236/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0200 - acc: 0.9940 - val_loss: 0.0194 - val_acc: 0.9942
Epoch 237/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0207 - acc: 0.9939 - val_loss: 0.0194 - val_acc: 0.9942
Epoch 238/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0203 - acc: 0.9940 - val_loss: 0.0194 - val_acc: 0.9942
Epoch 239/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0187 - acc: 0.9946 - val_loss: 0.0194 - val_acc: 0.9942
Epoch 240/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0210 - acc: 0.9939 - val_loss: 0.0193 - val_acc: 0.9942
Epoch 241/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0203 - acc: 0.9940 - val_loss: 0.0193 - val_acc: 0.9942
Epoch 242/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0196 - acc: 0.9950 - val_loss: 0.0193 - val_acc: 0.9942
Epoch 243/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0197 - acc: 0.9940 - val_loss: 0.0193 - val_acc: 0.9942
Epoch 244/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0203 - acc: 0.9942 - val_loss: 0.0193 - val_acc: 0.9942
Epoch 245/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0200 - acc: 0.9946 - val_loss: 0.0193 - val_acc: 0.9942
Epoch 246/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0201 - acc: 0.9937 - val_loss: 0.0193 - val_acc: 0.9942
Epoch 247/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0195 - acc: 0.9944 - val_loss: 0.0192 - val_acc: 0.9942
Epoch 248/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0189 - acc: 0.9944 - val_loss: 0.0192 - val_acc: 0.9942
Epoch 249/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0196 - acc: 0.9941 - val_loss: 0.0192 - val_acc: 0.9942
Epoch 250/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0189 - acc: 0.9946 - val_loss: 0.0192 - val_acc: 0.9942
Epoch 251/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0201 - acc: 0.9941 - val_loss: 0.0192 - val_acc: 0.9942
Epoch 252/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0194 - acc: 0.9943 - val_loss: 0.0192 - val_acc: 0.9942
Epoch 253/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0202 - acc: 0.9942 - val_loss: 0.0192 - val_acc: 0.9942
Epoch 254/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0203 - acc: 0.9940 - val_loss: 0.0192 - val_acc: 0.9942
Epoch 255/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0204 - acc: 0.9940 - val_loss: 0.0192 - val_acc: 0.9942
Epoch 256/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0197 - acc: 0.9938 - val_loss: 0.0191 - val_acc: 0.9942
Epoch 257/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0205 - acc: 0.9938 - val_loss: 0.0191 - val_acc: 0.9942
Epoch 258/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0200 - acc: 0.9937 - val_loss: 0.0191 - val_acc: 0.9944
Epoch 259/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0195 - acc: 0.9944 - val_loss: 0.0191 - val_acc: 0.9942
Epoch 260/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0192 - acc: 0.9943 - val_loss: 0.0191 - val_acc: 0.9942
Epoch 261/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0197 - acc: 0.9940 - val_loss: 0.0191 - val_acc: 0.9942
Epoch 262/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0201 - acc: 0.9940 - val_loss: 0.0191 - val_acc: 0.9942
Epoch 263/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0198 - acc: 0.9938 - val_loss: 0.0191 - val_acc: 0.9942
Epoch 264/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0197 - acc: 0.9946 - val_loss: 0.0190 - val_acc: 0.9944
Epoch 265/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0193 - acc: 0.9938 - val_loss: 0.0190 - val_acc: 0.9944
Epoch 266/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0194 - acc: 0.9945 - val_loss: 0.0190 - val_acc: 0.9944
Epoch 267/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0195 - acc: 0.9940 - val_loss: 0.0190 - val_acc: 0.9944
Epoch 268/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0196 - acc: 0.9946 - val_loss: 0.0190 - val_acc: 0.9944
Epoch 269/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0195 - acc: 0.9941 - val_loss: 0.0190 - val_acc: 0.9944
Epoch 270/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0194 - acc: 0.9944 - val_loss: 0.0189 - val_acc: 0.9944
Epoch 271/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0198 - acc: 0.9944 - val_loss: 0.0189 - val_acc: 0.9944
Epoch 272/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0194 - acc: 0.9944 - val_loss: 0.0189 - val_acc: 0.9944
Epoch 273/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0197 - acc: 0.9940 - val_loss: 0.0189 - val_acc: 0.9944
Epoch 274/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0199 - acc: 0.9941 - val_loss: 0.0189 - val_acc: 0.9944
Epoch 275/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0196 - acc: 0.9941 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 276/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0186 - acc: 0.9946 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 277/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0189 - acc: 0.9944 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 278/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0192 - acc: 0.9948 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 279/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0192 - acc: 0.9945 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 280/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0193 - acc: 0.9946 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 281/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0203 - acc: 0.9935 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 282/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9946 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 283/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0199 - acc: 0.9941 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 284/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0195 - acc: 0.9942 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 285/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0191 - acc: 0.9945 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 286/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0190 - acc: 0.9941 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 287/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0185 - acc: 0.9947 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 288/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0188 - acc: 0.9949 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 289/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0188 - acc: 0.9948 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 290/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0190 - acc: 0.9945 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 291/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0196 - acc: 0.9941 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 292/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0188 - acc: 0.9943 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 293/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0195 - acc: 0.9942 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 294/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0190 - acc: 0.9942 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 295/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0192 - acc: 0.9940 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 296/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9940 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 297/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0189 - acc: 0.9949 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 298/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0195 - acc: 0.9943 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 299/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0182 - acc: 0.9947 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 300/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0188 - acc: 0.9949 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 301/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0189 - acc: 0.9944 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 302/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0188 - acc: 0.9940 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 303/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0184 - acc: 0.9945 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 304/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0189 - acc: 0.9940 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 305/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0188 - acc: 0.9948 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 306/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0193 - acc: 0.9944 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 307/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0188 - acc: 0.9941 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 308/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0180 - acc: 0.9945 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 309/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0187 - acc: 0.9946 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 310/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0188 - acc: 0.9948 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 311/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0188 - acc: 0.9953 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 312/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0189 - acc: 0.9945 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 313/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0199 - acc: 0.9940 - val_loss: 0.0184 - val_acc: 0.9946
Epoch 314/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0183 - acc: 0.9949 - val_loss: 0.0184 - val_acc: 0.9946
Epoch 315/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0184 - acc: 0.9945 - val_loss: 0.0184 - val_acc: 0.9946
Epoch 316/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9942 - val_loss: 0.0184 - val_acc: 0.9946
Epoch 317/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0185 - acc: 0.9946 - val_loss: 0.0184 - val_acc: 0.9946
Epoch 318/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0182 - acc: 0.9950 - val_loss: 0.0184 - val_acc: 0.9946
Epoch 319/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0182 - acc: 0.9945 - val_loss: 0.0184 - val_acc: 0.9946
Epoch 320/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0191 - acc: 0.9943 - val_loss: 0.0184 - val_acc: 0.9946
Epoch 321/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9944 - val_loss: 0.0184 - val_acc: 0.9946
Epoch 322/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0181 - acc: 0.9947 - val_loss: 0.0183 - val_acc: 0.9946
Epoch 323/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0193 - acc: 0.9945 - val_loss: 0.0183 - val_acc: 0.9946
Epoch 324/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0179 - acc: 0.9946 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 325/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9950 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 326/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0191 - acc: 0.9944 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 327/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0185 - acc: 0.9946 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 328/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0181 - acc: 0.9942 - val_loss: 0.0183 - val_acc: 0.9946
Epoch 329/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0188 - acc: 0.9944 - val_loss: 0.0183 - val_acc: 0.9946
Epoch 330/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0187 - acc: 0.9944 - val_loss: 0.0183 - val_acc: 0.9946
Epoch 331/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9941 - val_loss: 0.0183 - val_acc: 0.9946
Epoch 332/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0175 - acc: 0.9950 - val_loss: 0.0183 - val_acc: 0.9946
Epoch 333/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9940 - val_loss: 0.0183 - val_acc: 0.9946
Epoch 334/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0184 - acc: 0.9947 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 335/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9945 - val_loss: 0.0182 - val_acc: 0.9946
Epoch 336/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0181 - acc: 0.9947 - val_loss: 0.0182 - val_acc: 0.9946
Epoch 337/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0194 - acc: 0.9943 - val_loss: 0.0182 - val_acc: 0.9946
Epoch 338/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0185 - acc: 0.9947 - val_loss: 0.0182 - val_acc: 0.9946
Epoch 339/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0179 - acc: 0.9945 - val_loss: 0.0182 - val_acc: 0.9946
Epoch 340/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0182 - acc: 0.9949 - val_loss: 0.0182 - val_acc: 0.9946
Epoch 341/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0188 - acc: 0.9941 - val_loss: 0.0182 - val_acc: 0.9946
Epoch 342/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9945 - val_loss: 0.0182 - val_acc: 0.9946
Epoch 343/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0195 - acc: 0.9936 - val_loss: 0.0181 - val_acc: 0.9948
Epoch 344/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0178 - acc: 0.9951 - val_loss: 0.0181 - val_acc: 0.9948
Epoch 345/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0180 - acc: 0.9945 - val_loss: 0.0181 - val_acc: 0.9948
Epoch 346/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0179 - acc: 0.9947 - val_loss: 0.0181 - val_acc: 0.9946
Epoch 347/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9944 - val_loss: 0.0181 - val_acc: 0.9948
Epoch 348/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0178 - acc: 0.9947 - val_loss: 0.0181 - val_acc: 0.9946
Epoch 349/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0177 - acc: 0.9950 - val_loss: 0.0181 - val_acc: 0.9946
Epoch 350/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0183 - acc: 0.9941 - val_loss: 0.0181 - val_acc: 0.9946
Epoch 351/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0185 - acc: 0.9951 - val_loss: 0.0181 - val_acc: 0.9946
Epoch 352/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0181 - acc: 0.9946 - val_loss: 0.0181 - val_acc: 0.9946
Epoch 353/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0182 - acc: 0.9947 - val_loss: 0.0181 - val_acc: 0.9946
Epoch 354/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0177 - acc: 0.9950 - val_loss: 0.0181 - val_acc: 0.9948
Epoch 355/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0173 - acc: 0.9946 - val_loss: 0.0181 - val_acc: 0.9948
Epoch 356/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0175 - acc: 0.9946 - val_loss: 0.0181 - val_acc: 0.9948
Epoch 357/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9938 - val_loss: 0.0180 - val_acc: 0.9948
Epoch 358/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0183 - acc: 0.9942 - val_loss: 0.0180 - val_acc: 0.9948
Epoch 359/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9947 - val_loss: 0.0180 - val_acc: 0.9948
Epoch 360/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0184 - acc: 0.9947 - val_loss: 0.0180 - val_acc: 0.9948
Epoch 361/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0179 - acc: 0.9940 - val_loss: 0.0180 - val_acc: 0.9948
Epoch 362/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0192 - acc: 0.9943 - val_loss: 0.0180 - val_acc: 0.9948
Epoch 363/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0180 - acc: 0.9949 - val_loss: 0.0180 - val_acc: 0.9948
Epoch 364/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0181 - acc: 0.9939 - val_loss: 0.0180 - val_acc: 0.9948
Epoch 365/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0184 - acc: 0.9944 - val_loss: 0.0180 - val_acc: 0.9948
Epoch 366/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0176 - acc: 0.9951 - val_loss: 0.0180 - val_acc: 0.9948
Epoch 367/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0187 - acc: 0.9937 - val_loss: 0.0180 - val_acc: 0.9946
Epoch 368/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0180 - acc: 0.9941 - val_loss: 0.0180 - val_acc: 0.9946
Epoch 00368: early stopping
The model is not good enough

Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2 
with lr: 0.0001, mom: 0.5, decay: 1e-06
[(2048,), (2048,), (2048,), (1536,)]
model created
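The shape list printed above suggests the merged model's input is the concatenation of bottleneck features from the four base networks (2048 + 2048 + 2048 + 1536 = 7680 values per image). A minimal NumPy sketch of that merge step (the model construction itself lives in `helper.py`, which is not shown, so this is only an assumed illustration of the feature layout):

```python
import numpy as np

# Bottleneck feature widths printed above: InceptionV3, Xception,
# ResNet50 (2048 each) and InceptionResNetV2 (1536).
widths = [2048, 2048, 2048, 1536]

# Stand-in features for a batch of 4 images (random, for illustration only).
features = [np.random.rand(4, w) for w in widths]

# The merged classifier sees the column-wise concatenation:
# one 7680-dimensional vector per image.
merged = np.concatenate(features, axis=1)
print(merged.shape)  # → (4, 7680)
```

A small dense head on top of this 7680-wide input is cheap to train, which is why each epoch below takes about a second despite 20,000 samples.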
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 59us/step - loss: 0.4970 - acc: 0.7706 - val_loss: 0.2994 - val_acc: 0.9702
Epoch 2/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.2786 - acc: 0.9300 - val_loss: 0.1928 - val_acc: 0.9844
Epoch 3/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.2013 - acc: 0.9594 - val_loss: 0.1456 - val_acc: 0.9866
Epoch 4/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.1579 - acc: 0.9707 - val_loss: 0.1195 - val_acc: 0.9888
Epoch 5/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.1350 - acc: 0.9756 - val_loss: 0.1026 - val_acc: 0.9888
[... Epochs 6–54 omitted: loss falls steadily from 0.1169 to 0.0333 and val_loss from 0.0909 to 0.0281, with val_acc climbing to 0.9938 ...]
Epoch 55/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0320 - acc: 0.9922 - val_loss: 0.0279 - val_acc: 0.9938
Epoch 56/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0329 - acc: 0.9918 - val_loss: 0.0277 - val_acc: 0.9938
Epoch 57/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0319 - acc: 0.9919 - val_loss: 0.0275 - val_acc: 0.9940
Epoch 58/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0331 - acc: 0.9913 - val_loss: 0.0273 - val_acc: 0.9940
Epoch 59/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0321 - acc: 0.9916 - val_loss: 0.0271 - val_acc: 0.9940
Epoch 60/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0322 - acc: 0.9920 - val_loss: 0.0269 - val_acc: 0.9940
Epoch 61/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0310 - acc: 0.9919 - val_loss: 0.0267 - val_acc: 0.9940
Epoch 62/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0314 - acc: 0.9922 - val_loss: 0.0266 - val_acc: 0.9940
Epoch 63/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0315 - acc: 0.9920 - val_loss: 0.0265 - val_acc: 0.9942
Epoch 64/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0310 - acc: 0.9919 - val_loss: 0.0263 - val_acc: 0.9940
Epoch 65/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0315 - acc: 0.9916 - val_loss: 0.0261 - val_acc: 0.9940
Epoch 66/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0311 - acc: 0.9919 - val_loss: 0.0260 - val_acc: 0.9940
Epoch 67/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0298 - acc: 0.9924 - val_loss: 0.0258 - val_acc: 0.9940
Epoch 68/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0314 - acc: 0.9917 - val_loss: 0.0257 - val_acc: 0.9940
Epoch 69/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0297 - acc: 0.9919 - val_loss: 0.0255 - val_acc: 0.9940
Epoch 70/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0304 - acc: 0.9920 - val_loss: 0.0254 - val_acc: 0.9942
Epoch 71/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0298 - acc: 0.9916 - val_loss: 0.0252 - val_acc: 0.9942
Epoch 72/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0303 - acc: 0.9920 - val_loss: 0.0251 - val_acc: 0.9942
Epoch 73/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0290 - acc: 0.9923 - val_loss: 0.0250 - val_acc: 0.9942
Epoch 74/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0295 - acc: 0.9921 - val_loss: 0.0249 - val_acc: 0.9942
Epoch 75/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0288 - acc: 0.9923 - val_loss: 0.0248 - val_acc: 0.9942
Epoch 76/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0292 - acc: 0.9922 - val_loss: 0.0247 - val_acc: 0.9942
Epoch 77/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0287 - acc: 0.9925 - val_loss: 0.0246 - val_acc: 0.9944
Epoch 78/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0285 - acc: 0.9929 - val_loss: 0.0245 - val_acc: 0.9944
Epoch 79/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0278 - acc: 0.9927 - val_loss: 0.0243 - val_acc: 0.9944
Epoch 80/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0267 - acc: 0.9937 - val_loss: 0.0243 - val_acc: 0.9944
Epoch 81/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0278 - acc: 0.9922 - val_loss: 0.0241 - val_acc: 0.9946
Epoch 82/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0278 - acc: 0.9925 - val_loss: 0.0240 - val_acc: 0.9946
Epoch 83/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0272 - acc: 0.9929 - val_loss: 0.0239 - val_acc: 0.9946
Epoch 84/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0279 - acc: 0.9926 - val_loss: 0.0238 - val_acc: 0.9948
Epoch 85/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0278 - acc: 0.9926 - val_loss: 0.0237 - val_acc: 0.9948
Epoch 86/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0270 - acc: 0.9927 - val_loss: 0.0236 - val_acc: 0.9946
Epoch 87/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0273 - acc: 0.9933 - val_loss: 0.0235 - val_acc: 0.9950
Epoch 88/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0276 - acc: 0.9927 - val_loss: 0.0234 - val_acc: 0.9950
Epoch 89/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0269 - acc: 0.9929 - val_loss: 0.0234 - val_acc: 0.9946
Epoch 90/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0266 - acc: 0.9926 - val_loss: 0.0233 - val_acc: 0.9950
Epoch 91/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0266 - acc: 0.9929 - val_loss: 0.0232 - val_acc: 0.9950
Epoch 92/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0263 - acc: 0.9932 - val_loss: 0.0231 - val_acc: 0.9950
Epoch 93/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0256 - acc: 0.9931 - val_loss: 0.0230 - val_acc: 0.9948
Epoch 94/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0258 - acc: 0.9931 - val_loss: 0.0230 - val_acc: 0.9948
Epoch 95/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0268 - acc: 0.9933 - val_loss: 0.0229 - val_acc: 0.9950
Epoch 96/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0262 - acc: 0.9931 - val_loss: 0.0228 - val_acc: 0.9950
Epoch 97/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0268 - acc: 0.9929 - val_loss: 0.0227 - val_acc: 0.9950
Epoch 98/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0266 - acc: 0.9929 - val_loss: 0.0226 - val_acc: 0.9950
Epoch 99/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0261 - acc: 0.9926 - val_loss: 0.0225 - val_acc: 0.9950
Epoch 100/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0261 - acc: 0.9928 - val_loss: 0.0224 - val_acc: 0.9950
Epoch 101/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0261 - acc: 0.9924 - val_loss: 0.0224 - val_acc: 0.9950
Epoch 102/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0260 - acc: 0.9930 - val_loss: 0.0223 - val_acc: 0.9950
Epoch 103/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0263 - acc: 0.9924 - val_loss: 0.0222 - val_acc: 0.9950
Epoch 104/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0254 - acc: 0.9931 - val_loss: 0.0222 - val_acc: 0.9950
Epoch 105/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0258 - acc: 0.9930 - val_loss: 0.0221 - val_acc: 0.9950
Epoch 106/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0247 - acc: 0.9937 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 107/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0255 - acc: 0.9927 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 108/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0243 - acc: 0.9935 - val_loss: 0.0219 - val_acc: 0.9952
Epoch 109/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0254 - acc: 0.9931 - val_loss: 0.0218 - val_acc: 0.9952
Epoch 110/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0253 - acc: 0.9931 - val_loss: 0.0218 - val_acc: 0.9952
Epoch 111/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0246 - acc: 0.9933 - val_loss: 0.0217 - val_acc: 0.9952
Epoch 112/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0253 - acc: 0.9935 - val_loss: 0.0217 - val_acc: 0.9952
Epoch 113/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0248 - acc: 0.9931 - val_loss: 0.0216 - val_acc: 0.9952
Epoch 114/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0247 - acc: 0.9928 - val_loss: 0.0216 - val_acc: 0.9952
Epoch 115/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0253 - acc: 0.9931 - val_loss: 0.0215 - val_acc: 0.9952
Epoch 116/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0244 - acc: 0.9929 - val_loss: 0.0215 - val_acc: 0.9952
Epoch 117/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0230 - acc: 0.9937 - val_loss: 0.0214 - val_acc: 0.9952
Epoch 118/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0239 - acc: 0.9933 - val_loss: 0.0213 - val_acc: 0.9954
Epoch 119/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0251 - acc: 0.9924 - val_loss: 0.0213 - val_acc: 0.9954
Epoch 120/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0238 - acc: 0.9933 - val_loss: 0.0212 - val_acc: 0.9954
Epoch 121/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0235 - acc: 0.9931 - val_loss: 0.0212 - val_acc: 0.9954
Epoch 122/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0242 - acc: 0.9936 - val_loss: 0.0211 - val_acc: 0.9954
Epoch 123/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0241 - acc: 0.9929 - val_loss: 0.0211 - val_acc: 0.9954
Epoch 124/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0242 - acc: 0.9935 - val_loss: 0.0210 - val_acc: 0.9954
Epoch 125/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0243 - acc: 0.9929 - val_loss: 0.0210 - val_acc: 0.9954
Epoch 126/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9936 - val_loss: 0.0209 - val_acc: 0.9954
Epoch 127/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0236 - acc: 0.9930 - val_loss: 0.0209 - val_acc: 0.9954
Epoch 128/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0240 - acc: 0.9937 - val_loss: 0.0208 - val_acc: 0.9954
Epoch 129/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9935 - val_loss: 0.0208 - val_acc: 0.9954
Epoch 130/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0226 - acc: 0.9938 - val_loss: 0.0207 - val_acc: 0.9954
Epoch 131/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9934 - val_loss: 0.0207 - val_acc: 0.9954
Epoch 132/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0241 - acc: 0.9929 - val_loss: 0.0206 - val_acc: 0.9954
Epoch 133/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9935 - val_loss: 0.0206 - val_acc: 0.9954
Epoch 134/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0229 - acc: 0.9937 - val_loss: 0.0205 - val_acc: 0.9954
Epoch 135/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0224 - acc: 0.9938 - val_loss: 0.0205 - val_acc: 0.9954
Epoch 136/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0224 - acc: 0.9940 - val_loss: 0.0205 - val_acc: 0.9954
Epoch 137/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0237 - acc: 0.9931 - val_loss: 0.0204 - val_acc: 0.9954
Epoch 138/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0226 - acc: 0.9932 - val_loss: 0.0204 - val_acc: 0.9954
Epoch 139/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9937 - val_loss: 0.0203 - val_acc: 0.9954
Epoch 140/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9937 - val_loss: 0.0203 - val_acc: 0.9954
Epoch 141/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0230 - acc: 0.9930 - val_loss: 0.0203 - val_acc: 0.9954
Epoch 142/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0230 - acc: 0.9932 - val_loss: 0.0202 - val_acc: 0.9954
Epoch 143/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0227 - acc: 0.9935 - val_loss: 0.0202 - val_acc: 0.9954
Epoch 144/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0227 - acc: 0.9937 - val_loss: 0.0202 - val_acc: 0.9954
Epoch 145/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0225 - acc: 0.9932 - val_loss: 0.0201 - val_acc: 0.9954
Epoch 146/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0232 - acc: 0.9935 - val_loss: 0.0200 - val_acc: 0.9952
Epoch 147/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0225 - acc: 0.9935 - val_loss: 0.0200 - val_acc: 0.9952
Epoch 148/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0216 - acc: 0.9949 - val_loss: 0.0200 - val_acc: 0.9952
Epoch 149/500
20000/20000 [==============================] - 1s 54us/step - loss: 0.0215 - acc: 0.9942 - val_loss: 0.0199 - val_acc: 0.9952
Epoch 150/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0226 - acc: 0.9934 - val_loss: 0.0199 - val_acc: 0.9952
Epoch 151/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0219 - acc: 0.9938 - val_loss: 0.0199 - val_acc: 0.9952
Epoch 152/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0229 - acc: 0.9933 - val_loss: 0.0198 - val_acc: 0.9952
Epoch 153/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0226 - acc: 0.9931 - val_loss: 0.0198 - val_acc: 0.9952
Epoch 154/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0219 - acc: 0.9937 - val_loss: 0.0198 - val_acc: 0.9952
Epoch 155/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0222 - acc: 0.9932 - val_loss: 0.0197 - val_acc: 0.9952
Epoch 156/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0221 - acc: 0.9935 - val_loss: 0.0197 - val_acc: 0.9952
Epoch 157/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0221 - acc: 0.9936 - val_loss: 0.0196 - val_acc: 0.9952
Epoch 158/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0221 - acc: 0.9937 - val_loss: 0.0196 - val_acc: 0.9952
Epoch 159/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0219 - acc: 0.9931 - val_loss: 0.0196 - val_acc: 0.9952
Epoch 160/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0214 - acc: 0.9941 - val_loss: 0.0196 - val_acc: 0.9952
Epoch 161/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0223 - acc: 0.9938 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 162/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0216 - acc: 0.9940 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 163/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0212 - acc: 0.9941 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 164/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0220 - acc: 0.9936 - val_loss: 0.0194 - val_acc: 0.9954
Epoch 165/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0208 - acc: 0.9944 - val_loss: 0.0194 - val_acc: 0.9954
Epoch 166/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0225 - acc: 0.9935 - val_loss: 0.0194 - val_acc: 0.9954
Epoch 167/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0215 - acc: 0.9938 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 168/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0219 - acc: 0.9937 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 169/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0209 - acc: 0.9940 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 170/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0219 - acc: 0.9937 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 171/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0218 - acc: 0.9937 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 172/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0206 - acc: 0.9939 - val_loss: 0.0192 - val_acc: 0.9954
Epoch 173/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0222 - acc: 0.9936 - val_loss: 0.0192 - val_acc: 0.9954
Epoch 174/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0216 - acc: 0.9940 - val_loss: 0.0192 - val_acc: 0.9954
Epoch 175/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0203 - acc: 0.9942 - val_loss: 0.0192 - val_acc: 0.9954
Epoch 176/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0219 - acc: 0.9939 - val_loss: 0.0191 - val_acc: 0.9954
Epoch 177/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0213 - acc: 0.9941 - val_loss: 0.0191 - val_acc: 0.9954
Epoch 178/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0214 - acc: 0.9937 - val_loss: 0.0191 - val_acc: 0.9954
Epoch 179/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0213 - acc: 0.9940 - val_loss: 0.0191 - val_acc: 0.9954
Epoch 180/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0213 - acc: 0.9940 - val_loss: 0.0190 - val_acc: 0.9954
Epoch 181/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0209 - acc: 0.9941 - val_loss: 0.0190 - val_acc: 0.9954
Epoch 182/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0204 - acc: 0.9940 - val_loss: 0.0190 - val_acc: 0.9954
Epoch 183/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0208 - acc: 0.9940 - val_loss: 0.0189 - val_acc: 0.9954
Epoch 184/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0213 - acc: 0.9935 - val_loss: 0.0189 - val_acc: 0.9954
Epoch 185/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0209 - acc: 0.9940 - val_loss: 0.0189 - val_acc: 0.9954
Epoch 186/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0215 - acc: 0.9939 - val_loss: 0.0189 - val_acc: 0.9954
Epoch 187/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0201 - acc: 0.9940 - val_loss: 0.0188 - val_acc: 0.9954
Epoch 188/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0212 - acc: 0.9937 - val_loss: 0.0188 - val_acc: 0.9954
Epoch 189/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0211 - acc: 0.9936 - val_loss: 0.0188 - val_acc: 0.9954
Epoch 190/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0212 - acc: 0.9940 - val_loss: 0.0188 - val_acc: 0.9954
Epoch 191/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0206 - acc: 0.9940 - val_loss: 0.0188 - val_acc: 0.9954
Epoch 192/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0202 - acc: 0.9943 - val_loss: 0.0187 - val_acc: 0.9954
Epoch 193/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0206 - acc: 0.9942 - val_loss: 0.0187 - val_acc: 0.9954
Epoch 194/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0202 - acc: 0.9941 - val_loss: 0.0187 - val_acc: 0.9954
Epoch 195/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0202 - acc: 0.9940 - val_loss: 0.0186 - val_acc: 0.9954
Epoch 196/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0199 - acc: 0.9939 - val_loss: 0.0186 - val_acc: 0.9954
Epoch 197/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0207 - acc: 0.9942 - val_loss: 0.0186 - val_acc: 0.9954
Epoch 198/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0194 - acc: 0.9948 - val_loss: 0.0186 - val_acc: 0.9954
Epoch 199/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0208 - acc: 0.9940 - val_loss: 0.0186 - val_acc: 0.9954
Epoch 200/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0196 - acc: 0.9947 - val_loss: 0.0186 - val_acc: 0.9954
Epoch 201/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0206 - acc: 0.9945 - val_loss: 0.0186 - val_acc: 0.9954
Epoch 202/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0206 - acc: 0.9940 - val_loss: 0.0185 - val_acc: 0.9954
Epoch 203/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0212 - acc: 0.9938 - val_loss: 0.0185 - val_acc: 0.9954
Epoch 204/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0193 - acc: 0.9947 - val_loss: 0.0185 - val_acc: 0.9954
Epoch 205/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0202 - acc: 0.9939 - val_loss: 0.0184 - val_acc: 0.9954
Epoch 206/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0205 - acc: 0.9935 - val_loss: 0.0184 - val_acc: 0.9954
Epoch 207/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0203 - acc: 0.9939 - val_loss: 0.0184 - val_acc: 0.9954
Epoch 208/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0202 - acc: 0.9942 - val_loss: 0.0183 - val_acc: 0.9954
Epoch 209/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0201 - acc: 0.9946 - val_loss: 0.0183 - val_acc: 0.9954
Epoch 210/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0212 - acc: 0.9936 - val_loss: 0.0183 - val_acc: 0.9954
Epoch 211/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0193 - acc: 0.9944 - val_loss: 0.0183 - val_acc: 0.9954
Epoch 212/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0203 - acc: 0.9943 - val_loss: 0.0183 - val_acc: 0.9954
Epoch 213/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0205 - acc: 0.9940 - val_loss: 0.0183 - val_acc: 0.9954
Epoch 214/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0196 - acc: 0.9948 - val_loss: 0.0183 - val_acc: 0.9954
Epoch 215/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0202 - acc: 0.9940 - val_loss: 0.0183 - val_acc: 0.9954
Epoch 216/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0190 - acc: 0.9937 - val_loss: 0.0182 - val_acc: 0.9954
Epoch 217/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0196 - acc: 0.9944 - val_loss: 0.0182 - val_acc: 0.9954
Epoch 218/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0198 - acc: 0.9941 - val_loss: 0.0182 - val_acc: 0.9954
Epoch 219/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0194 - acc: 0.9949 - val_loss: 0.0182 - val_acc: 0.9954
Epoch 220/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0193 - acc: 0.9937 - val_loss: 0.0181 - val_acc: 0.9954
Epoch 221/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0200 - acc: 0.9938 - val_loss: 0.0181 - val_acc: 0.9954
Epoch 222/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0191 - acc: 0.9949 - val_loss: 0.0181 - val_acc: 0.9954
Epoch 223/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0206 - acc: 0.9941 - val_loss: 0.0181 - val_acc: 0.9954
Epoch 224/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0193 - acc: 0.9938 - val_loss: 0.0181 - val_acc: 0.9954
Epoch 225/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0194 - acc: 0.9942 - val_loss: 0.0181 - val_acc: 0.9954
Epoch 226/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0204 - acc: 0.9935 - val_loss: 0.0180 - val_acc: 0.9954
Epoch 227/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0194 - acc: 0.9944 - val_loss: 0.0180 - val_acc: 0.9954
Epoch 228/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0200 - acc: 0.9940 - val_loss: 0.0180 - val_acc: 0.9954
Epoch 229/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0198 - acc: 0.9945 - val_loss: 0.0180 - val_acc: 0.9954
Epoch 230/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0196 - acc: 0.9943 - val_loss: 0.0180 - val_acc: 0.9954
Epoch 231/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0193 - acc: 0.9940 - val_loss: 0.0179 - val_acc: 0.9954
Epoch 232/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0200 - acc: 0.9944 - val_loss: 0.0179 - val_acc: 0.9954
Epoch 233/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0191 - acc: 0.9946 - val_loss: 0.0179 - val_acc: 0.9954
Epoch 234/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0190 - acc: 0.9944 - val_loss: 0.0179 - val_acc: 0.9954
Epoch 235/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0194 - acc: 0.9943 - val_loss: 0.0179 - val_acc: 0.9954
Epoch 236/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0194 - acc: 0.9942 - val_loss: 0.0179 - val_acc: 0.9954
Epoch 237/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0193 - acc: 0.9942 - val_loss: 0.0179 - val_acc: 0.9954
Epoch 238/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0193 - acc: 0.9944 - val_loss: 0.0178 - val_acc: 0.9954
Epoch 239/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0189 - acc: 0.9941 - val_loss: 0.0178 - val_acc: 0.9954
Epoch 240/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0186 - acc: 0.9943 - val_loss: 0.0178 - val_acc: 0.9956
Epoch 241/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0184 - acc: 0.9944 - val_loss: 0.0178 - val_acc: 0.9956
Epoch 242/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0198 - acc: 0.9946 - val_loss: 0.0178 - val_acc: 0.9956
Epoch 243/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0193 - acc: 0.9940 - val_loss: 0.0177 - val_acc: 0.9956
Epoch 244/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0187 - acc: 0.9944 - val_loss: 0.0177 - val_acc: 0.9956
Epoch 245/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0183 - acc: 0.9943 - val_loss: 0.0177 - val_acc: 0.9956
Epoch 246/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0189 - acc: 0.9942 - val_loss: 0.0177 - val_acc: 0.9956
Epoch 247/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0190 - acc: 0.9946 - val_loss: 0.0177 - val_acc: 0.9956
Epoch 248/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0196 - acc: 0.9937 - val_loss: 0.0177 - val_acc: 0.9956
Epoch 249/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0187 - acc: 0.9948 - val_loss: 0.0177 - val_acc: 0.9956
Epoch 250/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0185 - acc: 0.9943 - val_loss: 0.0176 - val_acc: 0.9956
Epoch 251/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9942 - val_loss: 0.0176 - val_acc: 0.9956
Epoch 252/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0185 - acc: 0.9949 - val_loss: 0.0176 - val_acc: 0.9956
Epoch 253/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0185 - acc: 0.9942 - val_loss: 0.0176 - val_acc: 0.9956
Epoch 254/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0193 - acc: 0.9938 - val_loss: 0.0176 - val_acc: 0.9956
Epoch 255/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0188 - acc: 0.9942 - val_loss: 0.0176 - val_acc: 0.9956
Epoch 256/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0193 - acc: 0.9941 - val_loss: 0.0176 - val_acc: 0.9956
Epoch 257/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0177 - acc: 0.9949 - val_loss: 0.0176 - val_acc: 0.9956
...
(epochs 258–499 omitted — the run has plateaued: val_loss eases from 0.0176 to 0.0158 and val_acc creeps from 0.9956 to 0.9962 over these 242 epochs)
...
Epoch 500/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0154 - acc: 0.9954 - val_loss: 0.0158 - val_acc: 0.9962
The model is not good enough

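The run below fits a "merged" classifier on pre-extracted bottleneck features from InceptionV3, Xception, ResNet50 and InceptionResNetV2; the line `[(2048,), (2048,), (2048,), (1536,)]` in the log is the per-model feature-vector shape. A minimal NumPy sketch of the merge step, assuming (as `helper.py` is not shown) that the four feature vectors are simply concatenated per sample before the dense classifier:

```python
import numpy as np

# Bottleneck feature widths printed in the log:
# InceptionV3, Xception, ResNet50 -> 2048; InceptionResNetV2 -> 1536
feature_shapes = [2048, 2048, 2048, 1536]
n_samples = 4  # tiny stand-in for the 20000 training samples

# Random stand-ins for the pre-extracted bottleneck features
features = [np.random.rand(n_samples, d) for d in feature_shapes]

# Merging concatenates the per-model feature vectors per sample
merged = np.concatenate(features, axis=1)
print(merged.shape)  # (4, 7680): 2048 * 3 + 1536 input units for the classifier
```

With 2048 × 3 + 1536 = 7680 merged features per image, the classifier head only has to learn a small logistic layer on top, which is why each epoch takes about a second.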
Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2 
with lr:0.0001 ,mom:0.9 , decay:0.0001
[(2048,), (2048,), (2048,), (1536,)]
model created
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 64us/step - loss: 0.2803 - acc: 0.8982 - val_loss: 0.1013 - val_acc: 0.9868
Epoch 2/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0942 - acc: 0.9829 - val_loss: 0.0668 - val_acc: 0.9882
Epoch 3/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0690 - acc: 0.9860 - val_loss: 0.0540 - val_acc: 0.9894
Epoch 4/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0564 - acc: 0.9886 - val_loss: 0.0469 - val_acc: 0.9900
Epoch 5/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0515 - acc: 0.9883 - val_loss: 0.0420 - val_acc: 0.9908
Epoch 6/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0463 - acc: 0.9894 - val_loss: 0.0386 - val_acc: 0.9912
Epoch 7/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0438 - acc: 0.9899 - val_loss: 0.0360 - val_acc: 0.9920
Epoch 8/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0400 - acc: 0.9901 - val_loss: 0.0343 - val_acc: 0.9912
Epoch 9/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0381 - acc: 0.9913 - val_loss: 0.0326 - val_acc: 0.9924
Epoch 10/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0356 - acc: 0.9910 - val_loss: 0.0313 - val_acc: 0.9928
Epoch 11/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0355 - acc: 0.9907 - val_loss: 0.0301 - val_acc: 0.9932
Epoch 12/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0335 - acc: 0.9910 - val_loss: 0.0292 - val_acc: 0.9932
Epoch 13/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0326 - acc: 0.9918 - val_loss: 0.0281 - val_acc: 0.9940
Epoch 14/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0320 - acc: 0.9911 - val_loss: 0.0275 - val_acc: 0.9934
Epoch 15/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0297 - acc: 0.9920 - val_loss: 0.0271 - val_acc: 0.9934
Epoch 16/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0292 - acc: 0.9920 - val_loss: 0.0263 - val_acc: 0.9936
Epoch 17/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0296 - acc: 0.9917 - val_loss: 0.0257 - val_acc: 0.9940
Epoch 18/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0277 - acc: 0.9924 - val_loss: 0.0254 - val_acc: 0.9936
Epoch 19/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0274 - acc: 0.9927 - val_loss: 0.0250 - val_acc: 0.9938
Epoch 20/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0268 - acc: 0.9925 - val_loss: 0.0245 - val_acc: 0.9940
Epoch 21/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0268 - acc: 0.9922 - val_loss: 0.0241 - val_acc: 0.9940
Epoch 22/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0261 - acc: 0.9925 - val_loss: 0.0239 - val_acc: 0.9940
Epoch 23/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0255 - acc: 0.9928 - val_loss: 0.0236 - val_acc: 0.9940
Epoch 24/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0259 - acc: 0.9927 - val_loss: 0.0233 - val_acc: 0.9940
Epoch 25/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0261 - acc: 0.9926 - val_loss: 0.0231 - val_acc: 0.9940
Epoch 26/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0242 - acc: 0.9927 - val_loss: 0.0228 - val_acc: 0.9942
Epoch 27/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0247 - acc: 0.9924 - val_loss: 0.0225 - val_acc: 0.9942
Epoch 28/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0248 - acc: 0.9932 - val_loss: 0.0224 - val_acc: 0.9942
Epoch 29/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0248 - acc: 0.9927 - val_loss: 0.0222 - val_acc: 0.9942
Epoch 30/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0240 - acc: 0.9933 - val_loss: 0.0218 - val_acc: 0.9948
Epoch 31/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9931 - val_loss: 0.0217 - val_acc: 0.9944
Epoch 32/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0238 - acc: 0.9932 - val_loss: 0.0216 - val_acc: 0.9944
Epoch 33/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0236 - acc: 0.9931 - val_loss: 0.0214 - val_acc: 0.9944
Epoch 34/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0226 - acc: 0.9935 - val_loss: 0.0213 - val_acc: 0.9944
Epoch 35/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0229 - acc: 0.9931 - val_loss: 0.0212 - val_acc: 0.9944
Epoch 36/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0224 - acc: 0.9933 - val_loss: 0.0209 - val_acc: 0.9948
Epoch 37/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0228 - acc: 0.9937 - val_loss: 0.0209 - val_acc: 0.9944
Epoch 38/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0218 - acc: 0.9940 - val_loss: 0.0208 - val_acc: 0.9944
Epoch 39/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0221 - acc: 0.9937 - val_loss: 0.0206 - val_acc: 0.9948
Epoch 40/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0219 - acc: 0.9942 - val_loss: 0.0205 - val_acc: 0.9948
Epoch 41/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0214 - acc: 0.9944 - val_loss: 0.0204 - val_acc: 0.9948
Epoch 42/500
20000/20000 [==============================] - 1s 51us/step - loss: 0.0211 - acc: 0.9936 - val_loss: 0.0203 - val_acc: 0.9948
Epoch 43/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0217 - acc: 0.9936 - val_loss: 0.0202 - val_acc: 0.9948
Epoch 44/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0209 - acc: 0.9941 - val_loss: 0.0200 - val_acc: 0.9948
Epoch 45/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0208 - acc: 0.9938 - val_loss: 0.0200 - val_acc: 0.9948
Epoch 46/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0212 - acc: 0.9937 - val_loss: 0.0199 - val_acc: 0.9950
Epoch 47/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0204 - acc: 0.9938 - val_loss: 0.0198 - val_acc: 0.9948
Epoch 48/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0209 - acc: 0.9939 - val_loss: 0.0198 - val_acc: 0.9948
Epoch 49/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0208 - acc: 0.9940 - val_loss: 0.0197 - val_acc: 0.9948
Epoch 50/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0207 - acc: 0.9938 - val_loss: 0.0196 - val_acc: 0.9948
Epoch 51/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0205 - acc: 0.9935 - val_loss: 0.0195 - val_acc: 0.9950
Epoch 52/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0201 - acc: 0.9938 - val_loss: 0.0194 - val_acc: 0.9950
Epoch 53/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0203 - acc: 0.9939 - val_loss: 0.0193 - val_acc: 0.9948
Epoch 54/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0199 - acc: 0.9941 - val_loss: 0.0193 - val_acc: 0.9948
Epoch 55/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0204 - acc: 0.9938 - val_loss: 0.0192 - val_acc: 0.9950
Epoch 56/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0207 - acc: 0.9940 - val_loss: 0.0191 - val_acc: 0.9950
Epoch 57/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0207 - acc: 0.9935 - val_loss: 0.0191 - val_acc: 0.9948
Epoch 58/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0203 - acc: 0.9937 - val_loss: 0.0191 - val_acc: 0.9948
Epoch 59/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0197 - acc: 0.9941 - val_loss: 0.0190 - val_acc: 0.9948
Epoch 60/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0195 - acc: 0.9937 - val_loss: 0.0189 - val_acc: 0.9950
Epoch 61/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0202 - acc: 0.9944 - val_loss: 0.0188 - val_acc: 0.9948
Epoch 62/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0196 - acc: 0.9942 - val_loss: 0.0188 - val_acc: 0.9948
Epoch 63/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0201 - acc: 0.9940 - val_loss: 0.0188 - val_acc: 0.9948
Epoch 64/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0191 - acc: 0.9942 - val_loss: 0.0187 - val_acc: 0.9948
Epoch 65/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0194 - acc: 0.9937 - val_loss: 0.0186 - val_acc: 0.9946
Epoch 66/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0187 - acc: 0.9945 - val_loss: 0.0186 - val_acc: 0.9948
Epoch 67/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0199 - acc: 0.9940 - val_loss: 0.0186 - val_acc: 0.9948
Epoch 68/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0197 - acc: 0.9940 - val_loss: 0.0185 - val_acc: 0.9948
Epoch 69/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0192 - acc: 0.9944 - val_loss: 0.0185 - val_acc: 0.9948
Epoch 70/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0191 - acc: 0.9944 - val_loss: 0.0185 - val_acc: 0.9948
Epoch 71/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9946 - val_loss: 0.0185 - val_acc: 0.9948
Epoch 72/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9944 - val_loss: 0.0184 - val_acc: 0.9948
Epoch 73/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0182 - acc: 0.9950 - val_loss: 0.0184 - val_acc: 0.9946
Epoch 74/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0197 - acc: 0.9940 - val_loss: 0.0183 - val_acc: 0.9946
Epoch 75/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9949 - val_loss: 0.0183 - val_acc: 0.9946
Epoch 76/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0184 - acc: 0.9944 - val_loss: 0.0183 - val_acc: 0.9948
Epoch 77/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0191 - acc: 0.9942 - val_loss: 0.0183 - val_acc: 0.9948
Epoch 78/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0194 - acc: 0.9937 - val_loss: 0.0183 - val_acc: 0.9948
Epoch 79/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0181 - acc: 0.9945 - val_loss: 0.0182 - val_acc: 0.9948
Epoch 80/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0192 - acc: 0.9942 - val_loss: 0.0182 - val_acc: 0.9946
Epoch 81/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0190 - acc: 0.9938 - val_loss: 0.0181 - val_acc: 0.9946
Epoch 82/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0178 - acc: 0.9944 - val_loss: 0.0181 - val_acc: 0.9946
Epoch 83/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9946 - val_loss: 0.0180 - val_acc: 0.9946
Epoch 84/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0178 - acc: 0.9945 - val_loss: 0.0180 - val_acc: 0.9946
Epoch 85/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0182 - acc: 0.9946 - val_loss: 0.0180 - val_acc: 0.9946
Epoch 86/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0182 - acc: 0.9940 - val_loss: 0.0180 - val_acc: 0.9946
Epoch 87/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0181 - acc: 0.9945 - val_loss: 0.0179 - val_acc: 0.9946
Epoch 88/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0185 - acc: 0.9947 - val_loss: 0.0179 - val_acc: 0.9946
Epoch 89/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0188 - acc: 0.9943 - val_loss: 0.0178 - val_acc: 0.9946
Epoch 90/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0185 - acc: 0.9944 - val_loss: 0.0179 - val_acc: 0.9946
Epoch 91/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0180 - acc: 0.9945 - val_loss: 0.0178 - val_acc: 0.9946
Epoch 92/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0170 - acc: 0.9953 - val_loss: 0.0178 - val_acc: 0.9946
Epoch 93/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0174 - acc: 0.9950 - val_loss: 0.0178 - val_acc: 0.9946
Epoch 94/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0176 - acc: 0.9949 - val_loss: 0.0178 - val_acc: 0.9946
Epoch 95/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0177 - acc: 0.9947 - val_loss: 0.0177 - val_acc: 0.9946
Epoch 96/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0179 - acc: 0.9946 - val_loss: 0.0177 - val_acc: 0.9948
Epoch 97/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0174 - acc: 0.9949 - val_loss: 0.0177 - val_acc: 0.9946
Epoch 98/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0178 - acc: 0.9949 - val_loss: 0.0177 - val_acc: 0.9946
Epoch 99/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0171 - acc: 0.9949 - val_loss: 0.0177 - val_acc: 0.9946
Epoch 100/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0174 - acc: 0.9947 - val_loss: 0.0177 - val_acc: 0.9946
Epoch 101/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0175 - acc: 0.9949 - val_loss: 0.0176 - val_acc: 0.9948
Epoch 102/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0177 - acc: 0.9945 - val_loss: 0.0176 - val_acc: 0.9946
Epoch 103/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0180 - acc: 0.9947 - val_loss: 0.0175 - val_acc: 0.9948
Epoch 104/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0182 - acc: 0.9946 - val_loss: 0.0175 - val_acc: 0.9948
Epoch 105/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0171 - acc: 0.9952 - val_loss: 0.0175 - val_acc: 0.9948
Epoch 106/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0177 - acc: 0.9946 - val_loss: 0.0174 - val_acc: 0.9950
Epoch 107/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0173 - acc: 0.9950 - val_loss: 0.0174 - val_acc: 0.9948
Epoch 108/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0179 - acc: 0.9945 - val_loss: 0.0174 - val_acc: 0.9948
Epoch 109/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0179 - acc: 0.9946 - val_loss: 0.0174 - val_acc: 0.9948
Epoch 110/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0174 - acc: 0.9950 - val_loss: 0.0174 - val_acc: 0.9948
Epoch 111/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0175 - acc: 0.9946 - val_loss: 0.0173 - val_acc: 0.9950
Epoch 112/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0169 - acc: 0.9947 - val_loss: 0.0174 - val_acc: 0.9948
Epoch 113/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0182 - acc: 0.9947 - val_loss: 0.0174 - val_acc: 0.9948
Epoch 114/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0169 - acc: 0.9950 - val_loss: 0.0174 - val_acc: 0.9948
Epoch 115/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0161 - acc: 0.9954 - val_loss: 0.0174 - val_acc: 0.9948
Epoch 116/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0174 - acc: 0.9949 - val_loss: 0.0173 - val_acc: 0.9950
Epoch 117/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0175 - acc: 0.9947 - val_loss: 0.0173 - val_acc: 0.9948
Epoch 118/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0171 - acc: 0.9947 - val_loss: 0.0173 - val_acc: 0.9948
Epoch 119/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0168 - acc: 0.9949 - val_loss: 0.0173 - val_acc: 0.9948
Epoch 120/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0173 - acc: 0.9942 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 121/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0165 - acc: 0.9950 - val_loss: 0.0173 - val_acc: 0.9948
Epoch 122/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0166 - acc: 0.9947 - val_loss: 0.0172 - val_acc: 0.9948
Epoch 123/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0174 - acc: 0.9945 - val_loss: 0.0172 - val_acc: 0.9948
Epoch 124/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0165 - acc: 0.9951 - val_loss: 0.0172 - val_acc: 0.9948
Epoch 125/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0169 - acc: 0.9948 - val_loss: 0.0172 - val_acc: 0.9948
Epoch 126/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0171 - acc: 0.9944 - val_loss: 0.0172 - val_acc: 0.9948
Epoch 127/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0174 - acc: 0.9947 - val_loss: 0.0172 - val_acc: 0.9948
Epoch 128/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0169 - acc: 0.9947 - val_loss: 0.0171 - val_acc: 0.9948
Epoch 129/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0173 - acc: 0.9942 - val_loss: 0.0171 - val_acc: 0.9950
Epoch 130/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0163 - acc: 0.9951 - val_loss: 0.0171 - val_acc: 0.9948
Epoch 131/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0163 - acc: 0.9953 - val_loss: 0.0171 - val_acc: 0.9948
Epoch 132/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0173 - acc: 0.9950 - val_loss: 0.0171 - val_acc: 0.9948
Epoch 133/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0164 - acc: 0.9949 - val_loss: 0.0170 - val_acc: 0.9948
Epoch 134/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0160 - acc: 0.9951 - val_loss: 0.0171 - val_acc: 0.9948
Epoch 135/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0161 - acc: 0.9954 - val_loss: 0.0171 - val_acc: 0.9948
Epoch 136/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0167 - acc: 0.9950 - val_loss: 0.0171 - val_acc: 0.9948
Epoch 137/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0176 - acc: 0.9943 - val_loss: 0.0170 - val_acc: 0.9950
Epoch 138/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0166 - acc: 0.9949 - val_loss: 0.0170 - val_acc: 0.9950
Epoch 139/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0165 - acc: 0.9946 - val_loss: 0.0170 - val_acc: 0.9948
Epoch 140/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0172 - acc: 0.9947 - val_loss: 0.0170 - val_acc: 0.9948
Epoch 141/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0169 - acc: 0.9947 - val_loss: 0.0170 - val_acc: 0.9948
Epoch 142/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0162 - acc: 0.9956 - val_loss: 0.0170 - val_acc: 0.9948
Epoch 143/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0163 - acc: 0.9952 - val_loss: 0.0169 - val_acc: 0.9948
Epoch 144/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0162 - acc: 0.9949 - val_loss: 0.0169 - val_acc: 0.9948
Epoch 145/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0167 - acc: 0.9946 - val_loss: 0.0169 - val_acc: 0.9948
Epoch 146/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0166 - acc: 0.9945 - val_loss: 0.0169 - val_acc: 0.9948
Epoch 147/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0168 - acc: 0.9946 - val_loss: 0.0169 - val_acc: 0.9950
Epoch 148/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0169 - acc: 0.9945 - val_loss: 0.0169 - val_acc: 0.9950
Epoch 149/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0164 - acc: 0.9946 - val_loss: 0.0169 - val_acc: 0.9948
Epoch 150/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0161 - acc: 0.9944 - val_loss: 0.0168 - val_acc: 0.9950
Epoch 151/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0167 - acc: 0.9946 - val_loss: 0.0168 - val_acc: 0.9950
Epoch 152/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0170 - acc: 0.9946 - val_loss: 0.0168 - val_acc: 0.9950
Epoch 153/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0159 - acc: 0.9949 - val_loss: 0.0168 - val_acc: 0.9948
Epoch 154/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0168 - acc: 0.9951 - val_loss: 0.0168 - val_acc: 0.9948
Epoch 155/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0164 - acc: 0.9952 - val_loss: 0.0168 - val_acc: 0.9948
Epoch 156/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0167 - acc: 0.9946 - val_loss: 0.0168 - val_acc: 0.9948
Epoch 157/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0159 - acc: 0.9955 - val_loss: 0.0168 - val_acc: 0.9948
Epoch 158/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0155 - acc: 0.9955 - val_loss: 0.0168 - val_acc: 0.9948
Epoch 159/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0160 - acc: 0.9953 - val_loss: 0.0168 - val_acc: 0.9948
Epoch 160/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0158 - acc: 0.9946 - val_loss: 0.0168 - val_acc: 0.9948
Epoch 161/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0157 - acc: 0.9956 - val_loss: 0.0168 - val_acc: 0.9946
Epoch 162/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0157 - acc: 0.9954 - val_loss: 0.0168 - val_acc: 0.9946
Epoch 163/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0167 - acc: 0.9949 - val_loss: 0.0168 - val_acc: 0.9946
Epoch 164/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0166 - acc: 0.9949 - val_loss: 0.0168 - val_acc: 0.9948
Epoch 165/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0166 - acc: 0.9950 - val_loss: 0.0167 - val_acc: 0.9948
Epoch 166/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0161 - acc: 0.9950 - val_loss: 0.0168 - val_acc: 0.9946
Epoch 167/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0157 - acc: 0.9955 - val_loss: 0.0167 - val_acc: 0.9946
Epoch 168/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0158 - acc: 0.9950 - val_loss: 0.0168 - val_acc: 0.9946
Epoch 169/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0166 - acc: 0.9947 - val_loss: 0.0167 - val_acc: 0.9946
Epoch 170/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0162 - acc: 0.9950 - val_loss: 0.0168 - val_acc: 0.9946
Epoch 171/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0160 - acc: 0.9947 - val_loss: 0.0167 - val_acc: 0.9946
Epoch 172/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0156 - acc: 0.9950 - val_loss: 0.0167 - val_acc: 0.9946
Epoch 173/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0162 - acc: 0.9950 - val_loss: 0.0167 - val_acc: 0.9948
Epoch 174/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0156 - acc: 0.9951 - val_loss: 0.0167 - val_acc: 0.9948
Epoch 175/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0158 - acc: 0.9953 - val_loss: 0.0167 - val_acc: 0.9948
Epoch 176/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9951 - val_loss: 0.0167 - val_acc: 0.9948
Epoch 177/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0159 - acc: 0.9956 - val_loss: 0.0167 - val_acc: 0.9948
Epoch 178/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0156 - acc: 0.9950 - val_loss: 0.0166 - val_acc: 0.9950
Epoch 179/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0161 - acc: 0.9950 - val_loss: 0.0166 - val_acc: 0.9948
Epoch 180/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0158 - acc: 0.9955 - val_loss: 0.0166 - val_acc: 0.9948
Epoch 181/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0160 - acc: 0.9951 - val_loss: 0.0166 - val_acc: 0.9948
Epoch 182/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0160 - acc: 0.9950 - val_loss: 0.0166 - val_acc: 0.9948
Epoch 183/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0155 - acc: 0.9946 - val_loss: 0.0166 - val_acc: 0.9948
Epoch 184/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0154 - acc: 0.9953 - val_loss: 0.0166 - val_acc: 0.9948
Epoch 185/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9950 - val_loss: 0.0166 - val_acc: 0.9946
Epoch 186/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0145 - acc: 0.9957 - val_loss: 0.0166 - val_acc: 0.9948
Epoch 187/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0166 - acc: 0.9950 - val_loss: 0.0166 - val_acc: 0.9946
Epoch 188/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9954 - val_loss: 0.0166 - val_acc: 0.9946
Epoch 189/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0165 - acc: 0.9948 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 190/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0154 - acc: 0.9952 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 191/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9951 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 192/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0155 - acc: 0.9954 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 193/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0154 - acc: 0.9953 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 194/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0156 - acc: 0.9947 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 195/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0152 - acc: 0.9956 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 196/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0155 - acc: 0.9959 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 197/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0152 - acc: 0.9955 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 198/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0158 - acc: 0.9949 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 199/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0153 - acc: 0.9950 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 200/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0162 - acc: 0.9953 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 201/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9951 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 202/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0150 - acc: 0.9955 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 203/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0156 - acc: 0.9955 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 204/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0152 - acc: 0.9958 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 205/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0153 - acc: 0.9955 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 206/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0146 - acc: 0.9959 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 207/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0158 - acc: 0.9953 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 208/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0153 - acc: 0.9954 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 209/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0152 - acc: 0.9951 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 210/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0156 - acc: 0.9951 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 211/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0158 - acc: 0.9949 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 212/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0153 - acc: 0.9953 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 213/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0156 - acc: 0.9951 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 214/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0152 - acc: 0.9950 - val_loss: 0.0164 - val_acc: 0.9948
Epoch 00214: early stopping
The model is not good enough

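The previous run stopped at "Epoch 00214: early stopping": Keras's `EarlyStopping` callback halts training once the monitored metric (here `val_loss`) fails to improve for `patience` consecutive epochs. A minimal pure-Python sketch of that rule (the `patience` and `min_delta` values are illustrative, not the notebook's actual callback settings):

```python
def early_stop_epoch(val_losses, patience=10, min_delta=0.0):
    """Return the 1-based epoch at which patience-based early
    stopping would fire, or None if training runs to the end."""
    best = float("inf")
    wait = 0  # epochs since the last improvement
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best - min_delta:
            best = loss
            wait = 0  # improvement resets the counter
        else:
            wait += 1
            if wait >= patience:
                return epoch  # no improvement for `patience` epochs
    return None

# val_loss improves, then plateaus for two epochs -> stop at epoch 4
print(early_stop_epoch([0.5, 0.4, 0.4, 0.4], patience=2))  # 4
```

In the log above, `val_loss` flattens around 0.0164 for many epochs before the callback fires, which matches this plateau-detection behaviour.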
Start Fitting Merged Model InceptionV3_Xception_ResNet50_InceptionResNetV2 
with lr:0.0001 ,mom:0.9 , decay:1e-05
[(2048,), (2048,), (2048,), (1536,)]
model created
Train on 20000 samples, validate on 5000 samples
Epoch 1/500
20000/20000 [==============================] - 1s 62us/step - loss: 0.2490 - acc: 0.9209 - val_loss: 0.1028 - val_acc: 0.9844
Epoch 2/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0968 - acc: 0.9810 - val_loss: 0.0684 - val_acc: 0.9870
Epoch 3/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0704 - acc: 0.9850 - val_loss: 0.0549 - val_acc: 0.9900
Epoch 4/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0594 - acc: 0.9869 - val_loss: 0.0476 - val_acc: 0.9904
Epoch 5/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0510 - acc: 0.9886 - val_loss: 0.0426 - val_acc: 0.9914
Epoch 6/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0464 - acc: 0.9887 - val_loss: 0.0392 - val_acc: 0.9916
Epoch 7/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0421 - acc: 0.9900 - val_loss: 0.0364 - val_acc: 0.9930
Epoch 8/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0399 - acc: 0.9905 - val_loss: 0.0343 - val_acc: 0.9930
Epoch 9/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0369 - acc: 0.9905 - val_loss: 0.0327 - val_acc: 0.9936
Epoch 10/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0376 - acc: 0.9908 - val_loss: 0.0314 - val_acc: 0.9936
Epoch 11/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0347 - acc: 0.9912 - val_loss: 0.0302 - val_acc: 0.9940
Epoch 12/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0338 - acc: 0.9909 - val_loss: 0.0292 - val_acc: 0.9940
Epoch 13/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0330 - acc: 0.9910 - val_loss: 0.0283 - val_acc: 0.9944
Epoch 14/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0304 - acc: 0.9915 - val_loss: 0.0275 - val_acc: 0.9944
Epoch 15/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0299 - acc: 0.9917 - val_loss: 0.0268 - val_acc: 0.9944
Epoch 16/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0296 - acc: 0.9915 - val_loss: 0.0262 - val_acc: 0.9944
Epoch 17/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0273 - acc: 0.9928 - val_loss: 0.0256 - val_acc: 0.9944
Epoch 18/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0281 - acc: 0.9923 - val_loss: 0.0251 - val_acc: 0.9944
Epoch 19/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0275 - acc: 0.9920 - val_loss: 0.0247 - val_acc: 0.9944
Epoch 20/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0264 - acc: 0.9928 - val_loss: 0.0243 - val_acc: 0.9944
Epoch 21/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0259 - acc: 0.9924 - val_loss: 0.0239 - val_acc: 0.9946
Epoch 22/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0264 - acc: 0.9922 - val_loss: 0.0237 - val_acc: 0.9944
Epoch 23/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0244 - acc: 0.9931 - val_loss: 0.0233 - val_acc: 0.9946
Epoch 24/500
20000/20000 [==============================] - 1s 52us/step - loss: 0.0241 - acc: 0.9935 - val_loss: 0.0230 - val_acc: 0.9946
Epoch 25/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0247 - acc: 0.9926 - val_loss: 0.0226 - val_acc: 0.9952
Epoch 26/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0238 - acc: 0.9932 - val_loss: 0.0224 - val_acc: 0.9952
Epoch 27/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9935 - val_loss: 0.0222 - val_acc: 0.9952
Epoch 28/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0225 - acc: 0.9936 - val_loss: 0.0219 - val_acc: 0.9952
Epoch 29/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0236 - acc: 0.9931 - val_loss: 0.0217 - val_acc: 0.9952
Epoch 30/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0234 - acc: 0.9929 - val_loss: 0.0215 - val_acc: 0.9952
Epoch 31/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0227 - acc: 0.9933 - val_loss: 0.0212 - val_acc: 0.9952
Epoch 32/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0215 - acc: 0.9938 - val_loss: 0.0212 - val_acc: 0.9952
Epoch 33/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0214 - acc: 0.9940 - val_loss: 0.0211 - val_acc: 0.9952
Epoch 34/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0213 - acc: 0.9940 - val_loss: 0.0208 - val_acc: 0.9952
Epoch 35/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0220 - acc: 0.9931 - val_loss: 0.0207 - val_acc: 0.9952
Epoch 36/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0211 - acc: 0.9944 - val_loss: 0.0206 - val_acc: 0.9950
Epoch 37/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0218 - acc: 0.9929 - val_loss: 0.0202 - val_acc: 0.9952
Epoch 38/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0211 - acc: 0.9936 - val_loss: 0.0203 - val_acc: 0.9950
Epoch 39/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0206 - acc: 0.9933 - val_loss: 0.0201 - val_acc: 0.9952
Epoch 40/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0208 - acc: 0.9939 - val_loss: 0.0199 - val_acc: 0.9952
Epoch 41/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0210 - acc: 0.9940 - val_loss: 0.0198 - val_acc: 0.9952
Epoch 42/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0203 - acc: 0.9937 - val_loss: 0.0198 - val_acc: 0.9950
Epoch 43/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0202 - acc: 0.9941 - val_loss: 0.0196 - val_acc: 0.9952
Epoch 44/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0207 - acc: 0.9937 - val_loss: 0.0196 - val_acc: 0.9950
Epoch 45/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0198 - acc: 0.9940 - val_loss: 0.0195 - val_acc: 0.9950
Epoch 46/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0201 - acc: 0.9943 - val_loss: 0.0193 - val_acc: 0.9948
Epoch 47/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0194 - acc: 0.9941 - val_loss: 0.0193 - val_acc: 0.9950
Epoch 48/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0192 - acc: 0.9942 - val_loss: 0.0191 - val_acc: 0.9948
Epoch 49/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0202 - acc: 0.9937 - val_loss: 0.0190 - val_acc: 0.9948
Epoch 50/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0196 - acc: 0.9943 - val_loss: 0.0190 - val_acc: 0.9950
Epoch 51/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0191 - acc: 0.9936 - val_loss: 0.0189 - val_acc: 0.9950
Epoch 52/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0183 - acc: 0.9945 - val_loss: 0.0188 - val_acc: 0.9950
Epoch 53/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0189 - acc: 0.9942 - val_loss: 0.0187 - val_acc: 0.9948
Epoch 54/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0196 - acc: 0.9941 - val_loss: 0.0186 - val_acc: 0.9950
Epoch 55/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0190 - acc: 0.9948 - val_loss: 0.0186 - val_acc: 0.9950
Epoch 56/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0183 - acc: 0.9940 - val_loss: 0.0185 - val_acc: 0.9952
Epoch 57/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0180 - acc: 0.9950 - val_loss: 0.0184 - val_acc: 0.9952
Epoch 58/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0181 - acc: 0.9944 - val_loss: 0.0184 - val_acc: 0.9950
Epoch 59/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0182 - acc: 0.9947 - val_loss: 0.0183 - val_acc: 0.9952
Epoch 60/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0180 - acc: 0.9946 - val_loss: 0.0182 - val_acc: 0.9952
Epoch 61/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0185 - acc: 0.9941 - val_loss: 0.0183 - val_acc: 0.9952
Epoch 62/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0192 - acc: 0.9940 - val_loss: 0.0181 - val_acc: 0.9952
Epoch 63/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0176 - acc: 0.9947 - val_loss: 0.0180 - val_acc: 0.9952
Epoch 64/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0176 - acc: 0.9944 - val_loss: 0.0180 - val_acc: 0.9952
Epoch 65/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0175 - acc: 0.9950 - val_loss: 0.0181 - val_acc: 0.9952
Epoch 66/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0174 - acc: 0.9946 - val_loss: 0.0179 - val_acc: 0.9952
Epoch 67/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0183 - acc: 0.9942 - val_loss: 0.0179 - val_acc: 0.9952
Epoch 68/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0183 - acc: 0.9945 - val_loss: 0.0178 - val_acc: 0.9954
Epoch 69/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0169 - acc: 0.9950 - val_loss: 0.0179 - val_acc: 0.9952
Epoch 70/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0172 - acc: 0.9947 - val_loss: 0.0178 - val_acc: 0.9952
Epoch 71/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0170 - acc: 0.9946 - val_loss: 0.0176 - val_acc: 0.9954
Epoch 72/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0179 - acc: 0.9946 - val_loss: 0.0177 - val_acc: 0.9952
Epoch 73/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0171 - acc: 0.9950 - val_loss: 0.0177 - val_acc: 0.9952
Epoch 74/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0180 - acc: 0.9946 - val_loss: 0.0176 - val_acc: 0.9954
Epoch 75/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0166 - acc: 0.9951 - val_loss: 0.0176 - val_acc: 0.9954
Epoch 76/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0177 - acc: 0.9942 - val_loss: 0.0175 - val_acc: 0.9954
Epoch 77/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0171 - acc: 0.9947 - val_loss: 0.0175 - val_acc: 0.9954
Epoch 78/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0170 - acc: 0.9950 - val_loss: 0.0174 - val_acc: 0.9954
Epoch 79/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0167 - acc: 0.9946 - val_loss: 0.0174 - val_acc: 0.9954
Epoch 80/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0167 - acc: 0.9946 - val_loss: 0.0174 - val_acc: 0.9954
Epoch 81/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0174 - acc: 0.9950 - val_loss: 0.0174 - val_acc: 0.9954
Epoch 82/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0168 - acc: 0.9950 - val_loss: 0.0173 - val_acc: 0.9954
Epoch 83/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0166 - acc: 0.9946 - val_loss: 0.0172 - val_acc: 0.9954
Epoch 84/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0166 - acc: 0.9952 - val_loss: 0.0173 - val_acc: 0.9954
Epoch 85/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0154 - acc: 0.9952 - val_loss: 0.0172 - val_acc: 0.9954
Epoch 86/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0155 - acc: 0.9953 - val_loss: 0.0172 - val_acc: 0.9954
Epoch 87/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0159 - acc: 0.9951 - val_loss: 0.0172 - val_acc: 0.9954
Epoch 88/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0162 - acc: 0.9952 - val_loss: 0.0171 - val_acc: 0.9954
Epoch 89/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0163 - acc: 0.9947 - val_loss: 0.0171 - val_acc: 0.9954
Epoch 90/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0163 - acc: 0.9948 - val_loss: 0.0171 - val_acc: 0.9954
Epoch 91/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9946 - val_loss: 0.0170 - val_acc: 0.9954
Epoch 92/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0168 - acc: 0.9950 - val_loss: 0.0170 - val_acc: 0.9956
Epoch 93/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0163 - acc: 0.9950 - val_loss: 0.0169 - val_acc: 0.9956
Epoch 94/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0164 - acc: 0.9950 - val_loss: 0.0169 - val_acc: 0.9956
Epoch 95/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9952 - val_loss: 0.0169 - val_acc: 0.9956
Epoch 96/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0157 - acc: 0.9956 - val_loss: 0.0168 - val_acc: 0.9956
Epoch 97/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0152 - acc: 0.9959 - val_loss: 0.0168 - val_acc: 0.9956
Epoch 98/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0158 - acc: 0.9953 - val_loss: 0.0168 - val_acc: 0.9956
Epoch 99/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0157 - acc: 0.9954 - val_loss: 0.0168 - val_acc: 0.9956
Epoch 100/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0152 - acc: 0.9955 - val_loss: 0.0169 - val_acc: 0.9952
Epoch 101/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9950 - val_loss: 0.0169 - val_acc: 0.9952
Epoch 102/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0157 - acc: 0.9950 - val_loss: 0.0169 - val_acc: 0.9952
Epoch 103/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0153 - acc: 0.9950 - val_loss: 0.0168 - val_acc: 0.9954
Epoch 104/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0158 - acc: 0.9955 - val_loss: 0.0167 - val_acc: 0.9954
Epoch 105/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0156 - acc: 0.9950 - val_loss: 0.0167 - val_acc: 0.9956
Epoch 106/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0154 - acc: 0.9947 - val_loss: 0.0166 - val_acc: 0.9956
Epoch 107/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0159 - acc: 0.9949 - val_loss: 0.0166 - val_acc: 0.9956
Epoch 108/500
20000/20000 [==============================] - 1s 50us/step - loss: 0.0221 - acc: 0.9930 - val_loss: 0.0198 - val_acc: 0.9942
Epoch 35/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0217 - acc: 0.9936 - val_loss: 0.0196 - val_acc: 0.9942
Epoch 36/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0207 - acc: 0.9936 - val_loss: 0.0195 - val_acc: 0.9942
Epoch 37/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0206 - acc: 0.9940 - val_loss: 0.0193 - val_acc: 0.9942
Epoch 38/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0204 - acc: 0.9934 - val_loss: 0.0193 - val_acc: 0.9942
Epoch 39/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0205 - acc: 0.9935 - val_loss: 0.0191 - val_acc: 0.9944
Epoch 40/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0202 - acc: 0.9944 - val_loss: 0.0189 - val_acc: 0.9946
Epoch 41/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0202 - acc: 0.9943 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 42/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0202 - acc: 0.9940 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 43/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0205 - acc: 0.9933 - val_loss: 0.0187 - val_acc: 0.9946
Epoch 44/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0204 - acc: 0.9937 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 45/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0192 - acc: 0.9946 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 46/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0201 - acc: 0.9940 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 47/500
20000/20000 [==============================] - 1s 49us/step - loss: 0.0198 - acc: 0.9943 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 48/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0189 - acc: 0.9942 - val_loss: 0.0181 - val_acc: 0.9944
Epoch 49/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0186 - acc: 0.9940 - val_loss: 0.0181 - val_acc: 0.9944
Epoch 50/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0184 - acc: 0.9950 - val_loss: 0.0180 - val_acc: 0.9944
Epoch 51/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0191 - acc: 0.9945 - val_loss: 0.0179 - val_acc: 0.9944
Epoch 52/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0193 - acc: 0.9940 - val_loss: 0.0179 - val_acc: 0.9944
Epoch 53/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0188 - acc: 0.9940 - val_loss: 0.0178 - val_acc: 0.9944
Epoch 54/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0182 - acc: 0.9941 - val_loss: 0.0177 - val_acc: 0.9944
Epoch 55/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0192 - acc: 0.9946 - val_loss: 0.0176 - val_acc: 0.9944
Epoch 56/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0187 - acc: 0.9944 - val_loss: 0.0176 - val_acc: 0.9944
Epoch 57/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0187 - acc: 0.9938 - val_loss: 0.0176 - val_acc: 0.9944
Epoch 58/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0183 - acc: 0.9938 - val_loss: 0.0175 - val_acc: 0.9944
Epoch 59/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0189 - acc: 0.9944 - val_loss: 0.0174 - val_acc: 0.9944
Epoch 60/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0179 - acc: 0.9944 - val_loss: 0.0174 - val_acc: 0.9942
Epoch 61/500
20000/20000 [==============================] - 1s 47us/step - loss: 0.0189 - acc: 0.9944 - val_loss: 0.0174 - val_acc: 0.9942
Epoch 62/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0172 - acc: 0.9946 - val_loss: 0.0173 - val_acc: 0.9942
Epoch 63/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0178 - acc: 0.9948 - val_loss: 0.0173 - val_acc: 0.9942
Epoch 64/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0179 - acc: 0.9946 - val_loss: 0.0173 - val_acc: 0.9942
Epoch 65/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0178 - acc: 0.9946 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 66/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0164 - acc: 0.9949 - val_loss: 0.0171 - val_acc: 0.9942
Epoch 67/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0165 - acc: 0.9950 - val_loss: 0.0172 - val_acc: 0.9942
Epoch 68/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0179 - acc: 0.9940 - val_loss: 0.0169 - val_acc: 0.9944
Epoch 69/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0167 - acc: 0.9953 - val_loss: 0.0169 - val_acc: 0.9944
Epoch 70/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0172 - acc: 0.9949 - val_loss: 0.0170 - val_acc: 0.9942
Epoch 71/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0177 - acc: 0.9939 - val_loss: 0.0169 - val_acc: 0.9942
Epoch 72/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0171 - acc: 0.9940 - val_loss: 0.0170 - val_acc: 0.9942
Epoch 73/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0174 - acc: 0.9945 - val_loss: 0.0169 - val_acc: 0.9944
Epoch 74/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0166 - acc: 0.9947 - val_loss: 0.0169 - val_acc: 0.9944
Epoch 75/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0171 - acc: 0.9943 - val_loss: 0.0168 - val_acc: 0.9944
Epoch 76/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0172 - acc: 0.9940 - val_loss: 0.0167 - val_acc: 0.9944
Epoch 77/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0161 - acc: 0.9953 - val_loss: 0.0167 - val_acc: 0.9944
Epoch 78/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0165 - acc: 0.9950 - val_loss: 0.0166 - val_acc: 0.9944
Epoch 79/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0164 - acc: 0.9946 - val_loss: 0.0166 - val_acc: 0.9944
Epoch 80/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0162 - acc: 0.9950 - val_loss: 0.0166 - val_acc: 0.9944
Epoch 81/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0165 - acc: 0.9952 - val_loss: 0.0166 - val_acc: 0.9944
Epoch 82/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0168 - acc: 0.9945 - val_loss: 0.0164 - val_acc: 0.9944
Epoch 83/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9949 - val_loss: 0.0164 - val_acc: 0.9944
Epoch 84/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0168 - acc: 0.9947 - val_loss: 0.0164 - val_acc: 0.9944
Epoch 85/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0163 - acc: 0.9946 - val_loss: 0.0164 - val_acc: 0.9944
Epoch 86/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0163 - acc: 0.9946 - val_loss: 0.0163 - val_acc: 0.9946
Epoch 87/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0161 - acc: 0.9949 - val_loss: 0.0164 - val_acc: 0.9944
Epoch 88/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0165 - acc: 0.9946 - val_loss: 0.0163 - val_acc: 0.9946
Epoch 89/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0160 - acc: 0.9950 - val_loss: 0.0163 - val_acc: 0.9946
Epoch 90/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9950 - val_loss: 0.0163 - val_acc: 0.9946
Epoch 91/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0157 - acc: 0.9951 - val_loss: 0.0163 - val_acc: 0.9946
Epoch 92/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9951 - val_loss: 0.0162 - val_acc: 0.9946
Epoch 93/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0166 - acc: 0.9946 - val_loss: 0.0162 - val_acc: 0.9946
Epoch 94/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0164 - acc: 0.9947 - val_loss: 0.0161 - val_acc: 0.9946
Epoch 95/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9952 - val_loss: 0.0161 - val_acc: 0.9946
Epoch 96/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0158 - acc: 0.9953 - val_loss: 0.0161 - val_acc: 0.9944
Epoch 97/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0148 - acc: 0.9956 - val_loss: 0.0161 - val_acc: 0.9944
Epoch 98/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0160 - acc: 0.9950 - val_loss: 0.0161 - val_acc: 0.9946
Epoch 99/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0150 - acc: 0.9953 - val_loss: 0.0161 - val_acc: 0.9944
Epoch 100/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0155 - acc: 0.9951 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 101/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9946 - val_loss: 0.0161 - val_acc: 0.9944
Epoch 102/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0153 - acc: 0.9955 - val_loss: 0.0160 - val_acc: 0.9944
Epoch 103/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0150 - acc: 0.9953 - val_loss: 0.0160 - val_acc: 0.9944
Epoch 104/500
20000/20000 [==============================] - 1s 53us/step - loss: 0.0158 - acc: 0.9950 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 105/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0154 - acc: 0.9951 - val_loss: 0.0159 - val_acc: 0.9944
Epoch 106/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0142 - acc: 0.9960 - val_loss: 0.0159 - val_acc: 0.9944
Epoch 107/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0159 - acc: 0.9948 - val_loss: 0.0159 - val_acc: 0.9944
Epoch 108/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0148 - acc: 0.9952 - val_loss: 0.0158 - val_acc: 0.9946
Epoch 109/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0153 - acc: 0.9950 - val_loss: 0.0158 - val_acc: 0.9946
Epoch 110/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0147 - acc: 0.9955 - val_loss: 0.0158 - val_acc: 0.9944
Epoch 111/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0147 - acc: 0.9956 - val_loss: 0.0158 - val_acc: 0.9944
Epoch 112/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0151 - acc: 0.9955 - val_loss: 0.0157 - val_acc: 0.9948
Epoch 113/500
20000/20000 [==============================] - 1s 48us/step - loss: 0.0152 - acc: 0.9953 - val_loss: 0.0158 - val_acc: 0.9944
Epoch 114/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0142 - acc: 0.9951 - val_loss: 0.0158 - val_acc: 0.9944
Epoch 115/500
20000/20000 [==============================] - 1s 46us/step - loss: 0.0153 - acc: 0.9953 - val_loss: 0.0156 - val_acc: 0.9946
Epoch 116/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0155 - acc: 0.9953 - val_loss: 0.0156 - val_acc: 0.9948
Epoch 117/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0150 - acc: 0.9956 - val_loss: 0.0156 - val_acc: 0.9948
Epoch 118/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0144 - acc: 0.9957 - val_loss: 0.0157 - val_acc: 0.9944
Epoch 119/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0146 - acc: 0.9956 - val_loss: 0.0157 - val_acc: 0.9944
Epoch 120/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0148 - acc: 0.9954 - val_loss: 0.0157 - val_acc: 0.9944
Epoch 121/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0152 - acc: 0.9951 - val_loss: 0.0157 - val_acc: 0.9944
Epoch 122/500
20000/20000 [==============================] - 1s 45us/step - loss: 0.0139 - acc: 0.9955 - val_loss: 0.0157 - val_acc: 0.9944
Epoch 123/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0144 - acc: 0.9954 - val_loss: 0.0157 - val_acc: 0.9944
Epoch 124/500
20000/20000 [==============================] - 1s 44us/step - loss: 0.0144 - acc: 0.9951 - val_loss: 0.0156 - val_acc: 0.9944
Epoch 00124: early stopping
The model is not good enough

Merged Model Training - HyperOpt Hyperparameter Optimization

In [20]:
from kopt import CompileFN, KMongoTrials, test_fn
from hyperopt import fmin, tpe, hp, STATUS_OK, Trials
from keras.optimizers import SGD
from helper import *




# 1. Define the data function returning the training (and, optionally, validation/test) data

def data(model_name):
    Xs, y = load_train_data_merge(model_name)
    return (Xs, y),


# 2. Define the model function returning a compiled Keras model
def model(train_data, lr=0.001, decay=0, momentum=0.9, dropout=0.5):
    # extract data dimensions
    input_shapes = []
    Xs = train_data[0]
    for feature in Xs:
        input_shapes.append(feature.shape[1:])

    sgd = SGD(lr=lr, decay=decay, momentum=momentum)
    model = create_model_merge_output(input_shapes, optimizer=sgd, drop_rate=dropout)

    return model


# Specify the optimization metrics
db_name = "merge"
exp_name = "myexp1"
objective = CompileFN(db_name, exp_name,
                      data_fn=data,
                      model_fn=model,
                      loss_metric="loss", # which metric to optimize for
                      loss_metric_mode="min",  # minimize the metric
                      valid_split=.2,  # use 20% of the training data for the validation set
                      save_dir="model/hyperopt/")  # place to store the models

# define the hyper-parameter ranges
# see https://github.com/hyperopt/hyperopt/wiki/FMin for more info
hyper_params = {
    "data": {
        "model_name": ["InceptionV3", "Xception", "ResNet50"]
    },
    "model": {
        "lr": hp.loguniform("m_lr", np.log(0.001), np.log(0.01)),  # 0.001 - 0.01
        "decay": hp.uniform("m_decay", 0.0005, 0.05),
        "momentum": hp.choice("m_mom", [0.5, 0.8, 0.9]),
        "dropout": hp.choice("m_do", [0.1, 0.2, 0.3, 0.5]),
    },
    "fit": {
        "batch_size": 128,
        "epochs": 200,
        "patience": 8
    }
}
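For intuition, the ranges above can be sampled with the standard library alone. This is only a rough sketch of a single random draw from the search space; the real hyperopt sampler is adaptive under TPE and the names below are local illustrations, not kopt/hyperopt API:

```python
import math
import random

random.seed(0)

# hp.loguniform draws log(x) uniformly, so x is spread evenly across decades
lr = math.exp(random.uniform(math.log(0.001), math.log(0.01)))

# hp.uniform draws directly from the interval
decay = random.uniform(0.0005, 0.05)

# hp.choice picks one entry from a fixed option list
momentum = random.choice([0.5, 0.8, 0.9])
dropout = random.choice([0.1, 0.2, 0.3, 0.5])

print(lr, decay, momentum, dropout)
```

The log-uniform draw for the learning rate matters here: a plain uniform draw over [0.001, 0.01] would put ~90% of samples above 0.002, while log-uniform explores small and large rates equally.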

[Note: this cell takes roughly 4-5 hours to run. For a quick sanity check, lower the max_evals argument of fmin to reduce the number of hyperparameter combinations tested.]
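One detail worth knowing before reading the result: for hp.choice parameters, the best dict returned by fmin holds indices into the option lists, not the values themselves (hyperopt's space_eval helper performs this mapping for real spaces). A minimal stdlib sketch of that decoding, using a hypothetical best dict:

```python
# Option lists matching the hp.choice entries in hyper_params above
choices = {
    "m_mom": [0.5, 0.8, 0.9],
    "m_do": [0.1, 0.2, 0.3, 0.5],
}

# Hypothetical fmin result: continuous params come back as values,
# hp.choice params come back as indices into their option lists
best = {"m_lr": 0.003, "m_decay": 0.01, "m_mom": 2, "m_do": 1}

decoded = {k: (choices[k][v] if k in choices else v) for k, v in best.items()}
print(decoded)  # {'m_lr': 0.003, 'm_decay': 0.01, 'm_mom': 0.9, 'm_do': 0.2}
```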

In [21]:
# test_fn(objective, hyper_params)

trials = Trials()
best = fmin(objective, hyper_params, trials=trials, algo=tpe.suggest, max_evals=100)
2018-03-27 08:47:19,462 [INFO] tpe_transform took 0.002131 seconds
2018-03-27 08:47:19,463 [INFO] TPE using 0 trials
2018-03-27 08:47:19,466 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 08:47:20,464 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 3s - loss: 0.0450 - acc: 0.9824 - val_loss: 0.0288 - val_acc: 0.9900
Epoch 2/200
 - 1s - loss: 0.0222 - acc: 0.9929 - val_loss: 0.0272 - val_acc: 0.9912
Epoch 3/200
 - 1s - loss: 0.0211 - acc: 0.9928 - val_loss: 0.0268 - val_acc: 0.9912
Epoch 4/200
 - 1s - loss: 0.0205 - acc: 0.9931 - val_loss: 0.0265 - val_acc: 0.9914
Epoch 5/200
 - 1s - loss: 0.0198 - acc: 0.9936 - val_loss: 0.0262 - val_acc: 0.9918
Epoch 6/200
 - 1s - loss: 0.0199 - acc: 0.9936 - val_loss: 0.0260 - val_acc: 0.9920
Epoch 7/200
 - 1s - loss: 0.0194 - acc: 0.9940 - val_loss: 0.0258 - val_acc: 0.9920
Epoch 8/200
 - 1s - loss: 0.0193 - acc: 0.9938 - val_loss: 0.0257 - val_acc: 0.9920
Epoch 9/200
 - 1s - loss: 0.0191 - acc: 0.9940 - val_loss: 0.0256 - val_acc: 0.9920
Epoch 10/200
 - 1s - loss: 0.0184 - acc: 0.9938 - val_loss: 0.0256 - val_acc: 0.9920
Epoch 11/200
 - 1s - loss: 0.0185 - acc: 0.9941 - val_loss: 0.0255 - val_acc: 0.9920
Epoch 12/200
 - 1s - loss: 0.0182 - acc: 0.9939 - val_loss: 0.0255 - val_acc: 0.9920
Epoch 13/200
 - 1s - loss: 0.0182 - acc: 0.9946 - val_loss: 0.0254 - val_acc: 0.9920
Epoch 14/200
 - 1s - loss: 0.0181 - acc: 0.9945 - val_loss: 0.0253 - val_acc: 0.9920
Epoch 15/200
 - 1s - loss: 0.0180 - acc: 0.9945 - val_loss: 0.0253 - val_acc: 0.9920
Epoch 16/200
 - 1s - loss: 0.0184 - acc: 0.9939 - val_loss: 0.0252 - val_acc: 0.9920
Epoch 17/200
 - 1s - loss: 0.0180 - acc: 0.9942 - val_loss: 0.0252 - val_acc: 0.9920
Epoch 18/200
 - 1s - loss: 0.0183 - acc: 0.9943 - val_loss: 0.0252 - val_acc: 0.9920
Epoch 19/200
 - 1s - loss: 0.0179 - acc: 0.9945 - val_loss: 0.0251 - val_acc: 0.9920
... (epochs 20-199 omitted: val_loss drifts from 0.0251 down to 0.0239; val_acc stays between 0.9916 and 0.9920)
Epoch 200/200
 - 1s - loss: 0.0164 - acc: 0.9951 - val_loss: 0.0239 - val_acc: 0.9918
2018-03-27 08:50:17,074 [INFO] Evaluate...
2018-03-27 08:50:18,582 [INFO] Done!
2018-03-27 08:50:18,588 [INFO] tpe_transform took 0.002450 seconds
2018-03-27 08:50:18,589 [INFO] TPE using 1/1 trials with best loss 0.023905
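In the run above the validation loss improves by well under 1e-4 per epoch for most of the 200 epochs, so each trial could end much earlier with an early-stopping check. A minimal sketch of the patience logic in pure Python (the function name and thresholds are illustrative, not part of the notebook's code):

```python
def early_stop_epoch(val_losses, patience=10, min_delta=1e-4):
    """Return the 1-based epoch at which training would stop,
    or None if the monitored loss keeps improving."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if best - loss > min_delta:   # meaningful improvement: reset the counter
            best = loss
            wait = 0
        else:                          # no improvement this epoch
            wait += 1
            if wait >= patience:
                return epoch
    return None

# A flat loss curve like the one logged above triggers the stop quickly:
early_stop_epoch([0.0250] * 50)   # stops after `patience` flat epochs
```

Keras ships this behavior as the `keras.callbacks.EarlyStopping(monitor='val_loss', patience=..., min_delta=...)` callback, which could be passed to `model.fit` here instead of running all 200 epochs.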
2018-03-27 08:50:18,591 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 08:50:19,583 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 3s - loss: 0.1502 - acc: 0.9624 - val_loss: 0.0859 - val_acc: 0.9864
Epoch 2/200
 - 1s - loss: 0.0856 - acc: 0.9832 - val_loss: 0.0747 - val_acc: 0.9866
... (epochs 3-199 omitted: val_loss falls steadily from 0.0698 to 0.0462; val_acc climbs from 0.9870 to 0.9896)
Epoch 200/200
 - 1s - loss: 0.0503 - acc: 0.9877 - val_loss: 0.0462 - val_acc: 0.9896
2018-03-27 08:53:17,113 [INFO] Evaluate...
2018-03-27 08:53:18,625 [INFO] Done!
2018-03-27 08:53:18,631 [INFO] tpe_transform took 0.002450 seconds
2018-03-27 08:53:18,632 [INFO] TPE using 2/2 trials with best loss 0.023905
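The per-epoch lines emitted by `model.fit(..., verbose=2)` follow a fixed format, so the loss curves of each trial can be recovered directly from a saved console log. A small parser sketch (the regex mirrors the log lines above; the function name is illustrative):

```python
import re

# One match per epoch line, e.g.
# " - 1s - loss: 0.0922 - acc: 0.9656 - val_loss: 0.0399 - val_acc: 0.9896"
EPOCH_LINE = re.compile(
    r"loss: (?P<loss>[\d.]+) - acc: (?P<acc>[\d.]+)"
    r" - val_loss: (?P<val_loss>[\d.]+) - val_acc: (?P<val_acc>[\d.]+)"
)

def parse_history(log_text):
    """Extract per-epoch metrics from Keras verbose=2 console output."""
    history = {"loss": [], "acc": [], "val_loss": [], "val_acc": []}
    for m in EPOCH_LINE.finditer(log_text):
        for key in history:
            history[key].append(float(m.group(key)))
    return history
```

The resulting dict has the same shape as `History.history`, so it can be fed straight into the `matplotlib` plotting already imported at the top of the notebook.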
2018-03-27 08:53:18,634 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 08:53:19,616 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 3s - loss: 0.0922 - acc: 0.9656 - val_loss: 0.0399 - val_acc: 0.9896
Epoch 2/200
 - 1s - loss: 0.0445 - acc: 0.9877 - val_loss: 0.0366 - val_acc: 0.9904
Epoch 3/200
 - 1s - loss: 0.0412 - acc: 0.9882 - val_loss: 0.0351 - val_acc: 0.9904
Epoch 4/200
 - 1s - loss: 0.0398 - acc: 0.9884 - val_loss: 0.0341 - val_acc: 0.9906
Epoch 5/200
 - 1s - loss: 0.0388 - acc: 0.9892 - val_loss: 0.0334 - val_acc: 0.9910
Epoch 6/200
 - 1s - loss: 0.0382 - acc: 0.9900 - val_loss: 0.0328 - val_acc: 0.9912
 ... (epochs 7–199 omitted: loss fell gradually from 0.0381 to 0.0298, val_loss from 0.0324 to 0.0264, val_acc rose from 0.9914 to 0.9922)
Epoch 200/200
 - 1s - loss: 0.0301 - acc: 0.9915 - val_loss: 0.0264 - val_acc: 0.9922
2018-03-27 08:56:16,576 [INFO] Evaluate...
2018-03-27 08:56:18,133 [INFO] Done!
2018-03-27 08:56:18,140 [INFO] tpe_transform took 0.003276 seconds
2018-03-27 08:56:18,140 [INFO] TPE using 3/3 trials with best loss 0.023905
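The `TPE using k/k trials with best loss ...` lines come from a hyperopt-style sequential search, where each trial trains the classifier and the best validation loss so far is tracked. A minimal stand-in for that trial bookkeeping, assuming a toy quadratic objective in place of the real Keras training run (the actual notebook presumably calls `hyperopt.fmin` with `tpe.suggest`; plain random sampling is used here to keep the sketch self-contained):

```python
import random

def run_trials(objective, space_sampler, n_trials, seed=0):
    """Minimal trials loop: sample hyperparameters, evaluate, track best loss."""
    rng = random.Random(seed)
    best_loss, best_params = float("inf"), None
    for k in range(1, n_trials + 1):
        params = space_sampler(rng)
        loss = objective(params)
        if loss < best_loss:
            best_loss, best_params = loss, params
        # Mirrors the log format above: "TPE using k/k trials with best loss ..."
        print(f"using {k}/{n_trials} trials with best loss {best_loss:.6f}")
    return best_params, best_loss

# Toy stand-in for "train the classifier, return val_loss" (hypothetical).
objective = lambda p: (p["lr"] - 0.001) ** 2
sampler = lambda rng: {"lr": rng.uniform(1e-4, 1e-2)}

best_params, best_loss = run_trials(objective, sampler, n_trials=3)
```

Unlike random sampling, TPE proposes each new trial from a model of past results, but the surrounding loop and best-loss tracking are the same.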
2018-03-27 08:56:18,143 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 08:56:19,128 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 3s - loss: 0.0793 - acc: 0.9755 - val_loss: 0.0342 - val_acc: 0.9906
Epoch 2/200
 - 1s - loss: 0.0362 - acc: 0.9896 - val_loss: 0.0284 - val_acc: 0.9912
Epoch 3/200
 - 1s - loss: 0.0320 - acc: 0.9911 - val_loss: 0.0261 - val_acc: 0.9914
Epoch 4/200
 - 1s - loss: 0.0302 - acc: 0.9915 - val_loss: 0.0249 - val_acc: 0.9918
Epoch 5/200
 - 1s - loss: 0.0295 - acc: 0.9906 - val_loss: 0.0238 - val_acc: 0.9922
Epoch 6/200
 - 1s - loss: 0.0279 - acc: 0.9917 - val_loss: 0.0234 - val_acc: 0.9924
 ... (epochs 7–196 omitted: loss fell gradually from 0.0274 to 0.0200, val_loss from 0.0230 to 0.0189, val_acc rose from 0.9924 to 0.9934)
Epoch 197/200
 - 1s - loss: 0.0201 - acc: 0.9932 - val_loss: 0.0189 - val_acc: 0.9934
Epoch 198/200
 - 1s - loss: 0.0203 - acc: 0.9938 - val_loss: 0.0189 - val_acc: 0.9934
Epoch 199/200
 - 1s - loss: 0.0204 - acc: 0.9935 - val_loss: 0.0189 - val_acc: 0.9934
Epoch 200/200
 - 1s - loss: 0.0204 - acc: 0.9932 - val_loss: 0.0189 - val_acc: 0.9934
2018-03-27 08:59:16,832 [INFO] Evaluate...
2018-03-27 08:59:18,406 [INFO] Done!
2018-03-27 08:59:18,413 [INFO] tpe_transform took 0.002645 seconds
2018-03-27 08:59:18,414 [INFO] TPE using 4/4 trials with best loss 0.018868
2018-03-27 08:59:18,415 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 08:59:19,399 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 3s - loss: 0.0805 - acc: 0.9791 - val_loss: 0.0463 - val_acc: 0.9898
Epoch 2/200
 - 1s - loss: 0.0464 - acc: 0.9882 - val_loss: 0.0404 - val_acc: 0.9904
Epoch 3/200
 - 1s - loss: 0.0422 - acc: 0.9895 - val_loss: 0.0379 - val_acc: 0.9904
... (epochs 4-198 omitted; loss declines from ~0.040 to ~0.028, val_loss from 0.0364 to 0.0267, val_acc settling at 0.9926) ...
Epoch 199/200
 - 1s - loss: 0.0280 - acc: 0.9922 - val_loss: 0.0267 - val_acc: 0.9926
Epoch 200/200
 - 1s - loss: 0.0275 - acc: 0.9924 - val_loss: 0.0267 - val_acc: 0.9926
2018-03-27 09:02:17,614 [INFO] Evaluate...
2018-03-27 09:02:19,229 [INFO] Done!
2018-03-27 09:02:19,236 [INFO] tpe_transform took 0.002479 seconds
2018-03-27 09:02:19,236 [INFO] TPE using 5/5 trials with best loss 0.018868
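Each TPE trial in the log repeats the same pipeline: bottleneck features exported by InceptionV3, Xception and ResNet50 are loaded from the three `.h5` files, concatenated along the feature axis, and a small classifier is fit on the combined matrix. A minimal NumPy sketch of the concatenation step, using toy arrays in place of the real feature files (`stack_features` and the array shapes are illustrative assumptions, not the notebook's actual `helper.py` code):

```python
import numpy as np

def stack_features(feature_sets):
    """Concatenate per-model bottleneck features along the feature axis.

    feature_sets: list of arrays, each shaped (n_samples, n_features_i),
    standing in for what would be read from the original_*.h5 files above.
    """
    sample_counts = {f.shape[0] for f in feature_sets}
    assert len(sample_counts) == 1, "all feature sets must cover the same samples"
    return np.concatenate(feature_sets, axis=1)

# toy stand-ins for InceptionV3 / Xception / ResNet50 bottleneck features
rng = np.random.RandomState(0)
inception = rng.rand(8, 2048)
xception = rng.rand(8, 2048)
resnet = rng.rand(8, 2048)

X = stack_features([inception, xception, resnet])
print(X.shape)  # (8, 6144)
```

The classifier each trial then fits on `X` is a small dense head, which is why every epoch completes in about a second despite the 20,000-sample training set.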
2018-03-27 09:02:19,239 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 09:02:20,223 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 3s - loss: 0.1308 - acc: 0.9585 - val_loss: 0.0570 - val_acc: 0.9852
Epoch 2/200
 - 1s - loss: 0.0565 - acc: 0.9857 - val_loss: 0.0456 - val_acc: 0.9870
Epoch 3/200
 - 1s - loss: 0.0476 - acc: 0.9878 - val_loss: 0.0410 - val_acc: 0.9878
... (epochs 4-159 omitted; loss falls from ~0.043 to ~0.027, val_loss from 0.0384 to 0.0260, val_acc reaching 0.9912) ...
Epoch 160/200
 - 1s - loss: 0.0279 - acc: 0.9915 - val_loss: 0.0260 - val_acc: 0.9912
Epoch 161/200
 - 1s - loss: 0.0258 - acc: 0.9924 - val_loss: 0.0260 - val_acc: 0.9912
Epoch 162/200
 - 1s - loss: 0.0259 - acc: 0.9915 - val_loss: 0.0260 - val_acc: 0.9912
Epoch 163/200
 - 1s - loss: 0.0266 - acc: 0.9920 - val_loss: 0.0259 - val_acc: 0.9912
Epoch 164/200
 - 1s - loss: 0.0252 - acc: 0.9930 - val_loss: 0.0259 - val_acc: 0.9912
Epoch 165/200
 - 1s - loss: 0.0268 - acc: 0.9922 - val_loss: 0.0259 - val_acc: 0.9912
Epoch 166/200
 - 1s - loss: 0.0261 - acc: 0.9920 - val_loss: 0.0259 - val_acc: 0.9912
Epoch 167/200
 - 1s - loss: 0.0262 - acc: 0.9917 - val_loss: 0.0259 - val_acc: 0.9912
Epoch 168/200
 - 1s - loss: 0.0268 - acc: 0.9914 - val_loss: 0.0259 - val_acc: 0.9912
Epoch 169/200
 - 1s - loss: 0.0255 - acc: 0.9926 - val_loss: 0.0259 - val_acc: 0.9912
Epoch 170/200
 - 1s - loss: 0.0256 - acc: 0.9924 - val_loss: 0.0259 - val_acc: 0.9912
Epoch 171/200
 - 1s - loss: 0.0257 - acc: 0.9922 - val_loss: 0.0259 - val_acc: 0.9912
Epoch 172/200
 - 1s - loss: 0.0264 - acc: 0.9920 - val_loss: 0.0259 - val_acc: 0.9912
Epoch 173/200
 - 1s - loss: 0.0263 - acc: 0.9918 - val_loss: 0.0259 - val_acc: 0.9912
Epoch 174/200
 - 1s - loss: 0.0266 - acc: 0.9914 - val_loss: 0.0259 - val_acc: 0.9912
Epoch 175/200
 - 1s - loss: 0.0267 - acc: 0.9918 - val_loss: 0.0259 - val_acc: 0.9912
Epoch 176/200
 - 1s - loss: 0.0273 - acc: 0.9920 - val_loss: 0.0258 - val_acc: 0.9912
Epoch 177/200
 - 1s - loss: 0.0257 - acc: 0.9924 - val_loss: 0.0258 - val_acc: 0.9912
Epoch 178/200
 - 1s - loss: 0.0257 - acc: 0.9925 - val_loss: 0.0258 - val_acc: 0.9912
Epoch 179/200
 - 1s - loss: 0.0256 - acc: 0.9923 - val_loss: 0.0258 - val_acc: 0.9912
Epoch 180/200
 - 1s - loss: 0.0255 - acc: 0.9924 - val_loss: 0.0258 - val_acc: 0.9912
Epoch 181/200
 - 1s - loss: 0.0254 - acc: 0.9931 - val_loss: 0.0258 - val_acc: 0.9912
Epoch 182/200
 - 1s - loss: 0.0254 - acc: 0.9929 - val_loss: 0.0258 - val_acc: 0.9912
Epoch 183/200
 - 1s - loss: 0.0260 - acc: 0.9926 - val_loss: 0.0258 - val_acc: 0.9912
Epoch 184/200
 - 1s - loss: 0.0268 - acc: 0.9917 - val_loss: 0.0258 - val_acc: 0.9912
Epoch 185/200
 - 1s - loss: 0.0255 - acc: 0.9922 - val_loss: 0.0258 - val_acc: 0.9912
Epoch 186/200
 - 1s - loss: 0.0259 - acc: 0.9922 - val_loss: 0.0258 - val_acc: 0.9912
Epoch 187/200
 - 1s - loss: 0.0262 - acc: 0.9926 - val_loss: 0.0258 - val_acc: 0.9912
Epoch 188/200
 - 1s - loss: 0.0257 - acc: 0.9925 - val_loss: 0.0258 - val_acc: 0.9912
Epoch 189/200
 - 1s - loss: 0.0257 - acc: 0.9917 - val_loss: 0.0258 - val_acc: 0.9912
Epoch 190/200
 - 1s - loss: 0.0254 - acc: 0.9927 - val_loss: 0.0257 - val_acc: 0.9912
Epoch 191/200
 - 1s - loss: 0.0250 - acc: 0.9928 - val_loss: 0.0257 - val_acc: 0.9912
Epoch 192/200
 - 1s - loss: 0.0259 - acc: 0.9919 - val_loss: 0.0257 - val_acc: 0.9912
Epoch 193/200
 - 1s - loss: 0.0261 - acc: 0.9921 - val_loss: 0.0257 - val_acc: 0.9912
Epoch 194/200
 - 1s - loss: 0.0256 - acc: 0.9923 - val_loss: 0.0257 - val_acc: 0.9912
Epoch 195/200
 - 1s - loss: 0.0257 - acc: 0.9926 - val_loss: 0.0257 - val_acc: 0.9912
Epoch 196/200
 - 1s - loss: 0.0260 - acc: 0.9924 - val_loss: 0.0257 - val_acc: 0.9912
Epoch 197/200
 - 1s - loss: 0.0248 - acc: 0.9924 - val_loss: 0.0257 - val_acc: 0.9912
Epoch 198/200
 - 1s - loss: 0.0248 - acc: 0.9927 - val_loss: 0.0257 - val_acc: 0.9912
Epoch 199/200
 - 1s - loss: 0.0261 - acc: 0.9918 - val_loss: 0.0257 - val_acc: 0.9912
Epoch 200/200
 - 1s - loss: 0.0256 - acc: 0.9922 - val_loss: 0.0257 - val_acc: 0.9912
2018-03-27 09:05:18,665 [INFO] Evaluate...
2018-03-27 09:05:20,319 [INFO] Done!
2018-03-27 09:05:20,326 [INFO] tpe_transform took 0.003263 seconds
2018-03-27 09:05:20,326 [INFO] TPE using 6/6 trials with best loss 0.018868
2018-03-27 09:05:20,328 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
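The "load features from 「original_*.h5」" lines above suggest pre-extracted bottleneck features from InceptionV3, Xception, and ResNet50 being read from HDF5 files and concatenated into one design matrix before fitting. A self-contained sketch of that step (dataset names, feature dimensions, and the tiny stand-in files written here are assumptions; the notebook's actual helper may differ):

```python
# Hedged sketch: read per-model feature files and concatenate along the
# feature axis. Stand-in files are created first so the example runs alone.
import h5py
import numpy as np

names = ["original_InceptionV3.h5", "original_Xception.h5",
         "original_ResNet50.h5"]

# Hypothetical stand-ins: 10 samples x 2048 features per model, binary labels.
for name in names:
    with h5py.File(name, "w") as f:
        f.create_dataset("train", data=np.random.rand(10, 2048))
        f.create_dataset("label", data=np.arange(10) % 2)

X_parts, y = [], None
for name in names:
    with h5py.File(name, "r") as f:
        X_parts.append(np.array(f["train"]))
        y = np.array(f["label"])   # identical sample ordering assumed per file
    print("load features from 「%s」" % name)

X = np.concatenate(X_parts, axis=1)  # shape: (n_samples, 3 * 2048)
print(X.shape)
```

The concatenated matrix is what the small classifier head (the model compiled with `['loss', 'acc']` metrics above) is then trained on.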
2018-03-27 09:05:21,330 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 3s - loss: 0.0726 - acc: 0.9737 - val_loss: 0.0367 - val_acc: 0.9904
Epoch 2/200
 - 1s - loss: 0.0377 - acc: 0.9904 - val_loss: 0.0327 - val_acc: 0.9910
Epoch 3/200
 - 1s - loss: 0.0352 - acc: 0.9910 - val_loss: 0.0311 - val_acc: 0.9918
Epoch 4/200
 - 1s - loss: 0.0337 - acc: 0.9914 - val_loss: 0.0300 - val_acc: 0.9918
Epoch 5/200
 - 1s - loss: 0.0327 - acc: 0.9913 - val_loss: 0.0292 - val_acc: 0.9918
Epoch 6/200
 - 1s - loss: 0.0324 - acc: 0.9914 - val_loss: 0.0287 - val_acc: 0.9918
Epoch 7/200
 - 1s - loss: 0.0311 - acc: 0.9919 - val_loss: 0.0283 - val_acc: 0.9920
Epoch 8/200
 - 1s - loss: 0.0313 - acc: 0.9918 - val_loss: 0.0279 - val_acc: 0.9922
Epoch 9/200
 - 1s - loss: 0.0304 - acc: 0.9922 - val_loss: 0.0276 - val_acc: 0.9922
Epoch 10/200
 - 1s - loss: 0.0307 - acc: 0.9920 - val_loss: 0.0274 - val_acc: 0.9922
Epoch 11/200
 - 1s - loss: 0.0298 - acc: 0.9927 - val_loss: 0.0271 - val_acc: 0.9922
Epoch 12/200
 - 1s - loss: 0.0298 - acc: 0.9920 - val_loss: 0.0270 - val_acc: 0.9924
Epoch 13/200
 - 1s - loss: 0.0296 - acc: 0.9919 - val_loss: 0.0268 - val_acc: 0.9924
Epoch 14/200
 - 1s - loss: 0.0298 - acc: 0.9917 - val_loss: 0.0266 - val_acc: 0.9926
Epoch 15/200
 - 1s - loss: 0.0293 - acc: 0.9926 - val_loss: 0.0265 - val_acc: 0.9926
Epoch 16/200
 - 1s - loss: 0.0296 - acc: 0.9923 - val_loss: 0.0264 - val_acc: 0.9926
Epoch 17/200
 - 1s - loss: 0.0292 - acc: 0.9922 - val_loss: 0.0262 - val_acc: 0.9926
Epoch 18/200
 - 1s - loss: 0.0291 - acc: 0.9920 - val_loss: 0.0261 - val_acc: 0.9926
Epoch 19/200
 - 1s - loss: 0.0288 - acc: 0.9922 - val_loss: 0.0260 - val_acc: 0.9928
Epoch 20/200
 - 1s - loss: 0.0284 - acc: 0.9927 - val_loss: 0.0259 - val_acc: 0.9930
Epoch 21/200
 - 1s - loss: 0.0287 - acc: 0.9922 - val_loss: 0.0258 - val_acc: 0.9930
Epoch 22/200
 - 1s - loss: 0.0287 - acc: 0.9921 - val_loss: 0.0258 - val_acc: 0.9930
Epoch 23/200
 - 1s - loss: 0.0286 - acc: 0.9923 - val_loss: 0.0257 - val_acc: 0.9930
Epoch 24/200
 - 1s - loss: 0.0283 - acc: 0.9924 - val_loss: 0.0256 - val_acc: 0.9930
Epoch 25/200
 - 1s - loss: 0.0285 - acc: 0.9922 - val_loss: 0.0255 - val_acc: 0.9930
Epoch 26/200
 - 1s - loss: 0.0282 - acc: 0.9925 - val_loss: 0.0255 - val_acc: 0.9930
Epoch 27/200
 - 1s - loss: 0.0284 - acc: 0.9922 - val_loss: 0.0254 - val_acc: 0.9930
Epoch 28/200
 - 1s - loss: 0.0282 - acc: 0.9921 - val_loss: 0.0253 - val_acc: 0.9930
Epoch 29/200
 - 1s - loss: 0.0281 - acc: 0.9922 - val_loss: 0.0253 - val_acc: 0.9930
Epoch 30/200
 - 1s - loss: 0.0283 - acc: 0.9922 - val_loss: 0.0252 - val_acc: 0.9930
Epoch 31/200
 - 1s - loss: 0.0280 - acc: 0.9920 - val_loss: 0.0252 - val_acc: 0.9930
Epoch 32/200
 - 1s - loss: 0.0282 - acc: 0.9923 - val_loss: 0.0251 - val_acc: 0.9930
Epoch 33/200
 - 1s - loss: 0.0280 - acc: 0.9926 - val_loss: 0.0251 - val_acc: 0.9930
Epoch 34/200
 - 1s - loss: 0.0275 - acc: 0.9924 - val_loss: 0.0250 - val_acc: 0.9930
Epoch 35/200
 - 1s - loss: 0.0276 - acc: 0.9931 - val_loss: 0.0250 - val_acc: 0.9930
Epoch 36/200
 - 1s - loss: 0.0278 - acc: 0.9923 - val_loss: 0.0249 - val_acc: 0.9932
Epoch 37/200
 - 1s - loss: 0.0275 - acc: 0.9931 - val_loss: 0.0249 - val_acc: 0.9932
Epoch 38/200
 - 1s - loss: 0.0276 - acc: 0.9921 - val_loss: 0.0249 - val_acc: 0.9932
Epoch 39/200
 - 1s - loss: 0.0273 - acc: 0.9926 - val_loss: 0.0248 - val_acc: 0.9932
Epoch 40/200
 - 1s - loss: 0.0274 - acc: 0.9922 - val_loss: 0.0248 - val_acc: 0.9932
Epoch 41/200
 - 1s - loss: 0.0275 - acc: 0.9922 - val_loss: 0.0247 - val_acc: 0.9932
Epoch 42/200
 - 1s - loss: 0.0271 - acc: 0.9928 - val_loss: 0.0247 - val_acc: 0.9932
Epoch 43/200
 - 1s - loss: 0.0274 - acc: 0.9924 - val_loss: 0.0247 - val_acc: 0.9932
Epoch 44/200
 - 1s - loss: 0.0271 - acc: 0.9930 - val_loss: 0.0246 - val_acc: 0.9932
Epoch 45/200
 - 1s - loss: 0.0275 - acc: 0.9925 - val_loss: 0.0246 - val_acc: 0.9932
Epoch 46/200
 - 1s - loss: 0.0271 - acc: 0.9927 - val_loss: 0.0246 - val_acc: 0.9932
Epoch 47/200
 - 1s - loss: 0.0270 - acc: 0.9926 - val_loss: 0.0245 - val_acc: 0.9932
Epoch 48/200
 - 1s - loss: 0.0275 - acc: 0.9923 - val_loss: 0.0245 - val_acc: 0.9932
Epoch 49/200
 - 1s - loss: 0.0271 - acc: 0.9929 - val_loss: 0.0245 - val_acc: 0.9932
Epoch 50/200
 - 1s - loss: 0.0270 - acc: 0.9928 - val_loss: 0.0245 - val_acc: 0.9934
Epoch 51/200
 - 1s - loss: 0.0272 - acc: 0.9931 - val_loss: 0.0244 - val_acc: 0.9934
Epoch 52/200
 - 1s - loss: 0.0270 - acc: 0.9927 - val_loss: 0.0244 - val_acc: 0.9934
Epoch 53/200
 - 1s - loss: 0.0265 - acc: 0.9928 - val_loss: 0.0244 - val_acc: 0.9934
Epoch 54/200
 - 1s - loss: 0.0264 - acc: 0.9928 - val_loss: 0.0243 - val_acc: 0.9934
Epoch 55/200
 - 1s - loss: 0.0264 - acc: 0.9932 - val_loss: 0.0243 - val_acc: 0.9934
Epoch 56/200
 - 1s - loss: 0.0267 - acc: 0.9926 - val_loss: 0.0243 - val_acc: 0.9934
Epoch 57/200
 - 1s - loss: 0.0269 - acc: 0.9923 - val_loss: 0.0243 - val_acc: 0.9934
Epoch 58/200
 - 1s - loss: 0.0271 - acc: 0.9927 - val_loss: 0.0243 - val_acc: 0.9934
Epoch 59/200
 - 1s - loss: 0.0270 - acc: 0.9924 - val_loss: 0.0242 - val_acc: 0.9934
Epoch 60/200
 - 1s - loss: 0.0267 - acc: 0.9929 - val_loss: 0.0242 - val_acc: 0.9934
Epoch 61/200
 - 1s - loss: 0.0267 - acc: 0.9926 - val_loss: 0.0242 - val_acc: 0.9934
Epoch 62/200
 - 1s - loss: 0.0268 - acc: 0.9923 - val_loss: 0.0242 - val_acc: 0.9934
Epoch 63/200
 - 1s - loss: 0.0267 - acc: 0.9927 - val_loss: 0.0241 - val_acc: 0.9934
Epoch 64/200
 - 1s - loss: 0.0266 - acc: 0.9928 - val_loss: 0.0241 - val_acc: 0.9934
Epoch 65/200
 - 1s - loss: 0.0263 - acc: 0.9931 - val_loss: 0.0241 - val_acc: 0.9934
Epoch 66/200
 - 1s - loss: 0.0264 - acc: 0.9923 - val_loss: 0.0241 - val_acc: 0.9934
Epoch 67/200
 - 1s - loss: 0.0266 - acc: 0.9922 - val_loss: 0.0241 - val_acc: 0.9934
Epoch 68/200
 - 1s - loss: 0.0264 - acc: 0.9928 - val_loss: 0.0240 - val_acc: 0.9934
Epoch 69/200
 - 1s - loss: 0.0264 - acc: 0.9926 - val_loss: 0.0240 - val_acc: 0.9934
Epoch 70/200
 - 1s - loss: 0.0265 - acc: 0.9924 - val_loss: 0.0240 - val_acc: 0.9934
Epoch 71/200
 - 1s - loss: 0.0263 - acc: 0.9923 - val_loss: 0.0240 - val_acc: 0.9934
Epoch 72/200
 - 1s - loss: 0.0264 - acc: 0.9932 - val_loss: 0.0240 - val_acc: 0.9934
Epoch 73/200
 - 1s - loss: 0.0261 - acc: 0.9931 - val_loss: 0.0240 - val_acc: 0.9934
Epoch 74/200
 - 1s - loss: 0.0263 - acc: 0.9927 - val_loss: 0.0239 - val_acc: 0.9936
Epoch 75/200
 - 1s - loss: 0.0260 - acc: 0.9927 - val_loss: 0.0239 - val_acc: 0.9936
Epoch 76/200
 - 1s - loss: 0.0263 - acc: 0.9927 - val_loss: 0.0239 - val_acc: 0.9936
Epoch 77/200
 - 1s - loss: 0.0262 - acc: 0.9926 - val_loss: 0.0239 - val_acc: 0.9936
Epoch 78/200
 - 1s - loss: 0.0264 - acc: 0.9923 - val_loss: 0.0239 - val_acc: 0.9936
Epoch 79/200
 - 1s - loss: 0.0267 - acc: 0.9924 - val_loss: 0.0239 - val_acc: 0.9936
Epoch 80/200
 - 1s - loss: 0.0262 - acc: 0.9933 - val_loss: 0.0238 - val_acc: 0.9936
Epoch 81/200
 - 1s - loss: 0.0260 - acc: 0.9929 - val_loss: 0.0238 - val_acc: 0.9936
Epoch 82/200
 - 1s - loss: 0.0263 - acc: 0.9928 - val_loss: 0.0238 - val_acc: 0.9936
Epoch 83/200
 - 1s - loss: 0.0260 - acc: 0.9929 - val_loss: 0.0238 - val_acc: 0.9936
Epoch 84/200
 - 1s - loss: 0.0263 - acc: 0.9929 - val_loss: 0.0238 - val_acc: 0.9936
Epoch 85/200
 - 1s - loss: 0.0263 - acc: 0.9929 - val_loss: 0.0238 - val_acc: 0.9936
Epoch 86/200
 - 1s - loss: 0.0259 - acc: 0.9931 - val_loss: 0.0238 - val_acc: 0.9936
Epoch 87/200
 - 1s - loss: 0.0258 - acc: 0.9929 - val_loss: 0.0237 - val_acc: 0.9936
Epoch 88/200
 - 1s - loss: 0.0263 - acc: 0.9925 - val_loss: 0.0237 - val_acc: 0.9936
Epoch 89/200
 - 1s - loss: 0.0256 - acc: 0.9928 - val_loss: 0.0237 - val_acc: 0.9936
Epoch 90/200
 - 1s - loss: 0.0258 - acc: 0.9930 - val_loss: 0.0237 - val_acc: 0.9936
Epoch 91/200
 - 1s - loss: 0.0264 - acc: 0.9928 - val_loss: 0.0237 - val_acc: 0.9936
Epoch 92/200
 - 1s - loss: 0.0261 - acc: 0.9929 - val_loss: 0.0237 - val_acc: 0.9936
Epoch 93/200
 - 1s - loss: 0.0261 - acc: 0.9929 - val_loss: 0.0237 - val_acc: 0.9936
Epoch 94/200
 - 1s - loss: 0.0263 - acc: 0.9927 - val_loss: 0.0237 - val_acc: 0.9936
Epoch 95/200
 - 1s - loss: 0.0262 - acc: 0.9920 - val_loss: 0.0236 - val_acc: 0.9936
Epoch 96/200
 - 1s - loss: 0.0259 - acc: 0.9926 - val_loss: 0.0236 - val_acc: 0.9936
Epoch 97/200
 - 1s - loss: 0.0263 - acc: 0.9931 - val_loss: 0.0236 - val_acc: 0.9936
Epoch 98/200
 - 1s - loss: 0.0261 - acc: 0.9927 - val_loss: 0.0236 - val_acc: 0.9936
Epoch 99/200
 - 1s - loss: 0.0262 - acc: 0.9927 - val_loss: 0.0236 - val_acc: 0.9936
Epoch 100/200
 - 1s - loss: 0.0257 - acc: 0.9931 - val_loss: 0.0236 - val_acc: 0.9936
Epoch 101/200
 - 1s - loss: 0.0259 - acc: 0.9929 - val_loss: 0.0236 - val_acc: 0.9936
Epoch 102/200
 - 1s - loss: 0.0261 - acc: 0.9928 - val_loss: 0.0236 - val_acc: 0.9936
Epoch 103/200
 - 1s - loss: 0.0262 - acc: 0.9924 - val_loss: 0.0235 - val_acc: 0.9936
Epoch 104/200
 - 1s - loss: 0.0260 - acc: 0.9931 - val_loss: 0.0235 - val_acc: 0.9936
Epoch 105/200
 - 1s - loss: 0.0260 - acc: 0.9929 - val_loss: 0.0235 - val_acc: 0.9936
Epoch 106/200
 - 1s - loss: 0.0258 - acc: 0.9928 - val_loss: 0.0235 - val_acc: 0.9936
Epoch 107/200
 - 1s - loss: 0.0257 - acc: 0.9930 - val_loss: 0.0235 - val_acc: 0.9936
Epoch 108/200
 - 1s - loss: 0.0256 - acc: 0.9928 - val_loss: 0.0235 - val_acc: 0.9936
Epoch 109/200
 - 1s - loss: 0.0265 - acc: 0.9926 - val_loss: 0.0235 - val_acc: 0.9936
Epoch 110/200
 - 1s - loss: 0.0260 - acc: 0.9926 - val_loss: 0.0235 - val_acc: 0.9936
Epoch 111/200
 - 1s - loss: 0.0255 - acc: 0.9931 - val_loss: 0.0235 - val_acc: 0.9936
Epoch 112/200
 - 1s - loss: 0.0257 - acc: 0.9931 - val_loss: 0.0235 - val_acc: 0.9936
Epoch 113/200
 - 1s - loss: 0.0259 - acc: 0.9929 - val_loss: 0.0234 - val_acc: 0.9936
Epoch 114/200
 - 1s - loss: 0.0257 - acc: 0.9930 - val_loss: 0.0234 - val_acc: 0.9936
Epoch 115/200
 - 1s - loss: 0.0259 - acc: 0.9928 - val_loss: 0.0234 - val_acc: 0.9936
Epoch 116/200
 - 1s - loss: 0.0260 - acc: 0.9924 - val_loss: 0.0234 - val_acc: 0.9936
Epoch 117/200
 - 1s - loss: 0.0258 - acc: 0.9924 - val_loss: 0.0234 - val_acc: 0.9936
Epoch 118/200
 - 1s - loss: 0.0259 - acc: 0.9931 - val_loss: 0.0234 - val_acc: 0.9936
Epoch 119/200
 - 1s - loss: 0.0254 - acc: 0.9931 - val_loss: 0.0234 - val_acc: 0.9938
Epoch 120/200
 - 1s - loss: 0.0262 - acc: 0.9923 - val_loss: 0.0234 - val_acc: 0.9938
Epoch 121/200
 - 1s - loss: 0.0252 - acc: 0.9926 - val_loss: 0.0234 - val_acc: 0.9938
Epoch 122/200
 - 1s - loss: 0.0257 - acc: 0.9923 - val_loss: 0.0234 - val_acc: 0.9938
Epoch 123/200
 - 1s - loss: 0.0257 - acc: 0.9928 - val_loss: 0.0234 - val_acc: 0.9938
Epoch 124/200
 - 1s - loss: 0.0258 - acc: 0.9929 - val_loss: 0.0233 - val_acc: 0.9938
Epoch 125/200
 - 1s - loss: 0.0254 - acc: 0.9926 - val_loss: 0.0233 - val_acc: 0.9938
Epoch 126/200
 - 1s - loss: 0.0253 - acc: 0.9931 - val_loss: 0.0233 - val_acc: 0.9938
Epoch 127/200
 - 1s - loss: 0.0255 - acc: 0.9926 - val_loss: 0.0233 - val_acc: 0.9938
Epoch 128/200
 - 1s - loss: 0.0254 - acc: 0.9932 - val_loss: 0.0233 - val_acc: 0.9938
Epoch 129/200
 - 1s - loss: 0.0255 - acc: 0.9929 - val_loss: 0.0233 - val_acc: 0.9938
Epoch 130/200
 - 1s - loss: 0.0255 - acc: 0.9929 - val_loss: 0.0233 - val_acc: 0.9938
Epoch 131/200
 - 1s - loss: 0.0259 - acc: 0.9929 - val_loss: 0.0233 - val_acc: 0.9938
Epoch 132/200
 - 1s - loss: 0.0256 - acc: 0.9929 - val_loss: 0.0233 - val_acc: 0.9938
Epoch 133/200
 - 1s - loss: 0.0255 - acc: 0.9933 - val_loss: 0.0233 - val_acc: 0.9938
Epoch 134/200
 - 1s - loss: 0.0253 - acc: 0.9933 - val_loss: 0.0233 - val_acc: 0.9938
Epoch 135/200
 - 1s - loss: 0.0255 - acc: 0.9932 - val_loss: 0.0233 - val_acc: 0.9938
Epoch 136/200
 - 1s - loss: 0.0251 - acc: 0.9927 - val_loss: 0.0232 - val_acc: 0.9938
Epoch 137/200
 - 1s - loss: 0.0250 - acc: 0.9927 - val_loss: 0.0232 - val_acc: 0.9938
Epoch 138/200
 - 1s - loss: 0.0255 - acc: 0.9929 - val_loss: 0.0232 - val_acc: 0.9938
Epoch 139/200
 - 1s - loss: 0.0255 - acc: 0.9926 - val_loss: 0.0232 - val_acc: 0.9938
Epoch 140/200
 - 1s - loss: 0.0254 - acc: 0.9927 - val_loss: 0.0232 - val_acc: 0.9938
Epoch 141/200
 - 1s - loss: 0.0256 - acc: 0.9931 - val_loss: 0.0232 - val_acc: 0.9938
Epoch 142/200
 - 1s - loss: 0.0253 - acc: 0.9933 - val_loss: 0.0232 - val_acc: 0.9938
Epoch 143/200
 - 1s - loss: 0.0259 - acc: 0.9923 - val_loss: 0.0232 - val_acc: 0.9938
Epoch 144/200
 - 1s - loss: 0.0253 - acc: 0.9926 - val_loss: 0.0232 - val_acc: 0.9938
Epoch 145/200
 - 1s - loss: 0.0252 - acc: 0.9935 - val_loss: 0.0232 - val_acc: 0.9938
Epoch 146/200
 - 1s - loss: 0.0256 - acc: 0.9927 - val_loss: 0.0232 - val_acc: 0.9938
Epoch 147/200
 - 1s - loss: 0.0254 - acc: 0.9926 - val_loss: 0.0232 - val_acc: 0.9938
Epoch 148/200
 - 1s - loss: 0.0258 - acc: 0.9923 - val_loss: 0.0232 - val_acc: 0.9938
Epoch 149/200
 - 1s - loss: 0.0254 - acc: 0.9930 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 150/200
 - 1s - loss: 0.0253 - acc: 0.9931 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 151/200
 - 1s - loss: 0.0250 - acc: 0.9932 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 152/200
 - 1s - loss: 0.0253 - acc: 0.9928 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 153/200
 - 1s - loss: 0.0253 - acc: 0.9931 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 154/200
 - 1s - loss: 0.0252 - acc: 0.9929 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 155/200
 - 1s - loss: 0.0255 - acc: 0.9929 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 156/200
 - 1s - loss: 0.0253 - acc: 0.9927 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 157/200
 - 1s - loss: 0.0254 - acc: 0.9931 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 158/200
 - 1s - loss: 0.0255 - acc: 0.9929 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 159/200
 - 1s - loss: 0.0253 - acc: 0.9929 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 160/200
 - 1s - loss: 0.0256 - acc: 0.9928 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 161/200
 - 1s - loss: 0.0253 - acc: 0.9927 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 162/200
 - 1s - loss: 0.0253 - acc: 0.9928 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 163/200
 - 1s - loss: 0.0253 - acc: 0.9928 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 164/200
 - 1s - loss: 0.0251 - acc: 0.9935 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 165/200
 - 1s - loss: 0.0255 - acc: 0.9931 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 166/200
 - 1s - loss: 0.0252 - acc: 0.9932 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 167/200
 - 1s - loss: 0.0254 - acc: 0.9929 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 168/200
 - 1s - loss: 0.0250 - acc: 0.9929 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 169/200
 - 1s - loss: 0.0255 - acc: 0.9928 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 170/200
 - 1s - loss: 0.0249 - acc: 0.9930 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 171/200
 - 1s - loss: 0.0254 - acc: 0.9931 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 172/200
 - 1s - loss: 0.0254 - acc: 0.9931 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 173/200
 - 1s - loss: 0.0251 - acc: 0.9929 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 174/200
 - 1s - loss: 0.0250 - acc: 0.9927 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 175/200
 - 1s - loss: 0.0251 - acc: 0.9930 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 176/200
 - 1s - loss: 0.0253 - acc: 0.9925 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 177/200
 - 1s - loss: 0.0253 - acc: 0.9929 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 178/200
 - 1s - loss: 0.0255 - acc: 0.9928 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 179/200
 - 1s - loss: 0.0250 - acc: 0.9929 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 180/200
 - 1s - loss: 0.0251 - acc: 0.9930 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 181/200
 - 1s - loss: 0.0254 - acc: 0.9929 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 182/200
 - 1s - loss: 0.0254 - acc: 0.9927 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 183/200
 - 1s - loss: 0.0253 - acc: 0.9931 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 184/200
 - 1s - loss: 0.0247 - acc: 0.9932 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 185/200
 - 1s - loss: 0.0252 - acc: 0.9927 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 186/200
 - 1s - loss: 0.0250 - acc: 0.9931 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 187/200
 - 1s - loss: 0.0251 - acc: 0.9931 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 188/200
 - 1s - loss: 0.0248 - acc: 0.9931 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 189/200
 - 1s - loss: 0.0250 - acc: 0.9931 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 190/200
 - 1s - loss: 0.0247 - acc: 0.9931 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 191/200
 - 1s - loss: 0.0251 - acc: 0.9931 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 192/200
 - 1s - loss: 0.0252 - acc: 0.9928 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 193/200
 - 1s - loss: 0.0249 - acc: 0.9932 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 194/200
 - 1s - loss: 0.0249 - acc: 0.9931 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 195/200
 - 1s - loss: 0.0252 - acc: 0.9931 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 196/200
 - 1s - loss: 0.0252 - acc: 0.9933 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 197/200
 - 1s - loss: 0.0245 - acc: 0.9932 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 198/200
 - 1s - loss: 0.0249 - acc: 0.9932 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 199/200
 - 1s - loss: 0.0248 - acc: 0.9932 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 200/200
 - 1s - loss: 0.0250 - acc: 0.9932 - val_loss: 0.0229 - val_acc: 0.9938
2018-03-27 09:08:20,208 [INFO] Evaluate...
2018-03-27 09:08:21,908 [INFO] Done!
2018-03-27 09:08:21,914 [INFO] tpe_transform took 0.002502 seconds
2018-03-27 09:08:21,914 [INFO] TPE using 7/7 trials with best loss 0.018868
2018-03-27 09:08:21,916 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 09:08:22,907 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 3s - loss: 0.0680 - acc: 0.9757 - val_loss: 0.0352 - val_acc: 0.9910
Epoch 2/200
 - 1s - loss: 0.0358 - acc: 0.9905 - val_loss: 0.0318 - val_acc: 0.9918
Epoch 3/200
 - 1s - loss: 0.0338 - acc: 0.9903 - val_loss: 0.0302 - val_acc: 0.9918
Epoch 4/200
 - 1s - loss: 0.0318 - acc: 0.9918 - val_loss: 0.0293 - val_acc: 0.9918
Epoch 5/200
 - 1s - loss: 0.0312 - acc: 0.9918 - val_loss: 0.0286 - val_acc: 0.9918
Epoch 6/200
 - 1s - loss: 0.0301 - acc: 0.9922 - val_loss: 0.0281 - val_acc: 0.9918
Epoch 7/200
 - 1s - loss: 0.0300 - acc: 0.9919 - val_loss: 0.0278 - val_acc: 0.9916
Epoch 8/200
 - 1s - loss: 0.0297 - acc: 0.9922 - val_loss: 0.0274 - val_acc: 0.9922
Epoch 9/200
 - 1s - loss: 0.0291 - acc: 0.9922 - val_loss: 0.0272 - val_acc: 0.9922
Epoch 10/200
 - 1s - loss: 0.0292 - acc: 0.9923 - val_loss: 0.0269 - val_acc: 0.9924
Epoch 11/200
 - 1s - loss: 0.0288 - acc: 0.9922 - val_loss: 0.0267 - val_acc: 0.9924
Epoch 12/200
 - 1s - loss: 0.0286 - acc: 0.9924 - val_loss: 0.0265 - val_acc: 0.9926
Epoch 13/200
 - 1s - loss: 0.0280 - acc: 0.9926 - val_loss: 0.0264 - val_acc: 0.9926
Epoch 14/200
 - 1s - loss: 0.0283 - acc: 0.9922 - val_loss: 0.0262 - val_acc: 0.9926
Epoch 15/200
 - 1s - loss: 0.0278 - acc: 0.9922 - val_loss: 0.0261 - val_acc: 0.9926
Epoch 16/200
 - 1s - loss: 0.0281 - acc: 0.9924 - val_loss: 0.0260 - val_acc: 0.9928
Epoch 17/200
 - 1s - loss: 0.0276 - acc: 0.9929 - val_loss: 0.0259 - val_acc: 0.9926
Epoch 18/200
 - 1s - loss: 0.0277 - acc: 0.9930 - val_loss: 0.0258 - val_acc: 0.9928
Epoch 19/200
 - 1s - loss: 0.0275 - acc: 0.9927 - val_loss: 0.0257 - val_acc: 0.9928
Epoch 20/200
 - 1s - loss: 0.0274 - acc: 0.9925 - val_loss: 0.0256 - val_acc: 0.9928
Epoch 21/200
 - 1s - loss: 0.0278 - acc: 0.9920 - val_loss: 0.0255 - val_acc: 0.9928
Epoch 22/200
 - 1s - loss: 0.0273 - acc: 0.9924 - val_loss: 0.0254 - val_acc: 0.9928
Epoch 23/200
 - 1s - loss: 0.0268 - acc: 0.9928 - val_loss: 0.0253 - val_acc: 0.9928
Epoch 24/200
 - 1s - loss: 0.0270 - acc: 0.9927 - val_loss: 0.0253 - val_acc: 0.9928
Epoch 25/200
 - 1s - loss: 0.0269 - acc: 0.9929 - val_loss: 0.0252 - val_acc: 0.9928
Epoch 26/200
 - 1s - loss: 0.0268 - acc: 0.9924 - val_loss: 0.0252 - val_acc: 0.9928
Epoch 27/200
 - 1s - loss: 0.0268 - acc: 0.9932 - val_loss: 0.0251 - val_acc: 0.9928
Epoch 28/200
 - 1s - loss: 0.0267 - acc: 0.9929 - val_loss: 0.0250 - val_acc: 0.9928
Epoch 29/200
 - 1s - loss: 0.0266 - acc: 0.9926 - val_loss: 0.0250 - val_acc: 0.9928
Epoch 30/200
 - 1s - loss: 0.0263 - acc: 0.9927 - val_loss: 0.0249 - val_acc: 0.9928
Epoch 31/200
 - 1s - loss: 0.0267 - acc: 0.9927 - val_loss: 0.0249 - val_acc: 0.9928
Epoch 32/200
 - 1s - loss: 0.0269 - acc: 0.9925 - val_loss: 0.0248 - val_acc: 0.9928
Epoch 33/200
 - 1s - loss: 0.0261 - acc: 0.9930 - val_loss: 0.0248 - val_acc: 0.9928
Epoch 34/200
 - 1s - loss: 0.0264 - acc: 0.9930 - val_loss: 0.0248 - val_acc: 0.9928
Epoch 35/200
 - 1s - loss: 0.0261 - acc: 0.9929 - val_loss: 0.0247 - val_acc: 0.9928
Epoch 36/200
 - 1s - loss: 0.0261 - acc: 0.9925 - val_loss: 0.0247 - val_acc: 0.9928
Epoch 37/200
 - 1s - loss: 0.0262 - acc: 0.9931 - val_loss: 0.0246 - val_acc: 0.9928
Epoch 38/200
 - 1s - loss: 0.0264 - acc: 0.9930 - val_loss: 0.0246 - val_acc: 0.9928
Epoch 39/200
 - 1s - loss: 0.0262 - acc: 0.9927 - val_loss: 0.0246 - val_acc: 0.9928
Epoch 40/200
 - 1s - loss: 0.0257 - acc: 0.9929 - val_loss: 0.0245 - val_acc: 0.9928
Epoch 41/200
 - 1s - loss: 0.0260 - acc: 0.9923 - val_loss: 0.0245 - val_acc: 0.9928
Epoch 42/200
 - 1s - loss: 0.0261 - acc: 0.9926 - val_loss: 0.0245 - val_acc: 0.9928
Epoch 43/200
 - 1s - loss: 0.0259 - acc: 0.9929 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 44/200
 - 1s - loss: 0.0258 - acc: 0.9929 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 45/200
 - 1s - loss: 0.0259 - acc: 0.9931 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 46/200
 - 1s - loss: 0.0259 - acc: 0.9929 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 47/200
 - 1s - loss: 0.0260 - acc: 0.9925 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 48/200
 - 1s - loss: 0.0259 - acc: 0.9928 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 49/200
 - 1s - loss: 0.0256 - acc: 0.9933 - val_loss: 0.0242 - val_acc: 0.9928
Epoch 50/200
 - 1s - loss: 0.0255 - acc: 0.9931 - val_loss: 0.0242 - val_acc: 0.9928
Epoch 51/200
 - 1s - loss: 0.0257 - acc: 0.9931 - val_loss: 0.0242 - val_acc: 0.9928
Epoch 52/200
 - 1s - loss: 0.0258 - acc: 0.9926 - val_loss: 0.0242 - val_acc: 0.9928
Epoch 53/200
 - 1s - loss: 0.0260 - acc: 0.9926 - val_loss: 0.0241 - val_acc: 0.9928
Epoch 54/200
 - 1s - loss: 0.0258 - acc: 0.9928 - val_loss: 0.0241 - val_acc: 0.9928
Epoch 55/200
 - 1s - loss: 0.0260 - acc: 0.9933 - val_loss: 0.0241 - val_acc: 0.9928
Epoch 56/200
 - 1s - loss: 0.0254 - acc: 0.9931 - val_loss: 0.0241 - val_acc: 0.9928
Epoch 57/200
 - 1s - loss: 0.0258 - acc: 0.9929 - val_loss: 0.0241 - val_acc: 0.9928
Epoch 58/200
 - 1s - loss: 0.0258 - acc: 0.9929 - val_loss: 0.0240 - val_acc: 0.9928
Epoch 59/200
 - 1s - loss: 0.0258 - acc: 0.9931 - val_loss: 0.0240 - val_acc: 0.9928
Epoch 60/200
 - 1s - loss: 0.0256 - acc: 0.9928 - val_loss: 0.0240 - val_acc: 0.9928
Epoch 61/200
 - 1s - loss: 0.0255 - acc: 0.9926 - val_loss: 0.0240 - val_acc: 0.9928
Epoch 62/200
 - 1s - loss: 0.0252 - acc: 0.9936 - val_loss: 0.0239 - val_acc: 0.9928
Epoch 63/200
 - 1s - loss: 0.0258 - acc: 0.9929 - val_loss: 0.0239 - val_acc: 0.9928
Epoch 64/200
 - 1s - loss: 0.0256 - acc: 0.9929 - val_loss: 0.0239 - val_acc: 0.9928
Epoch 65/200
 - 1s - loss: 0.0256 - acc: 0.9932 - val_loss: 0.0239 - val_acc: 0.9928
Epoch 66/200
 - 1s - loss: 0.0253 - acc: 0.9932 - val_loss: 0.0239 - val_acc: 0.9928
Epoch 67/200
 - 1s - loss: 0.0253 - acc: 0.9932 - val_loss: 0.0239 - val_acc: 0.9928
Epoch 68/200
 - 1s - loss: 0.0257 - acc: 0.9929 - val_loss: 0.0238 - val_acc: 0.9928
Epoch 69/200
 - 1s - loss: 0.0253 - acc: 0.9932 - val_loss: 0.0238 - val_acc: 0.9928
Epoch 70/200
 - 1s - loss: 0.0254 - acc: 0.9931 - val_loss: 0.0238 - val_acc: 0.9928
Epoch 71/200
 - 1s - loss: 0.0253 - acc: 0.9927 - val_loss: 0.0238 - val_acc: 0.9928
Epoch 72/200
 - 1s - loss: 0.0254 - acc: 0.9932 - val_loss: 0.0238 - val_acc: 0.9928
Epoch 73/200
 - 1s - loss: 0.0256 - acc: 0.9927 - val_loss: 0.0238 - val_acc: 0.9928
Epoch 74/200
 - 1s - loss: 0.0255 - acc: 0.9931 - val_loss: 0.0237 - val_acc: 0.9928
Epoch 75/200
 - 1s - loss: 0.0251 - acc: 0.9931 - val_loss: 0.0237 - val_acc: 0.9928
Epoch 76/200
 - 1s - loss: 0.0252 - acc: 0.9931 - val_loss: 0.0237 - val_acc: 0.9928
Epoch 77/200
 - 1s - loss: 0.0252 - acc: 0.9936 - val_loss: 0.0237 - val_acc: 0.9928
Epoch 78/200
 - 1s - loss: 0.0254 - acc: 0.9930 - val_loss: 0.0237 - val_acc: 0.9928
... (epochs 79–199 omitted: loss drifts slowly from 0.0251 to 0.0238, val_loss from 0.0237 to 0.0227, val_acc steady at 0.9926–0.9928)
Epoch 200/200
 - 1s - loss: 0.0242 - acc: 0.9933 - val_loss: 0.0227 - val_acc: 0.9926
2018-03-27 09:11:21,841 [INFO] Evaluate...
2018-03-27 09:11:23,570 [INFO] Done!
2018-03-27 09:11:23,576 [INFO] tpe_transform took 0.002573 seconds
2018-03-27 09:11:23,577 [INFO] TPE using 8/8 trials with best loss 0.018868
2018-03-27 09:11:23,578 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 09:11:24,572 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 3s - loss: 0.0536 - acc: 0.9793 - val_loss: 0.0277 - val_acc: 0.9928
Epoch 2/200
 - 1s - loss: 0.0284 - acc: 0.9918 - val_loss: 0.0253 - val_acc: 0.9930
Epoch 3/200
 - 1s - loss: 0.0261 - acc: 0.9923 - val_loss: 0.0245 - val_acc: 0.9934
Epoch 4/200
 - 1s - loss: 0.0256 - acc: 0.9925 - val_loss: 0.0239 - val_acc: 0.9936
Epoch 5/200
 - 1s - loss: 0.0250 - acc: 0.9924 - val_loss: 0.0235 - val_acc: 0.9940
... (epochs 6–199 omitted: val_loss declines gradually from 0.0232 to 0.0199, val_acc plateaus at 0.9944)
Epoch 200/200
 - 1s - loss: 0.0203 - acc: 0.9938 - val_loss: 0.0199 - val_acc: 0.9944
2018-03-27 09:14:23,775 [INFO] Evaluate...
2018-03-27 09:14:25,591 [INFO] Done!
2018-03-27 09:14:25,599 [INFO] tpe_transform took 0.003267 seconds
2018-03-27 09:14:25,599 [INFO] TPE using 9/9 trials with best loss 0.018868
2018-03-27 09:14:25,601 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 09:14:26,604 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 3s - loss: 0.0465 - acc: 0.9829 - val_loss: 0.0250 - val_acc: 0.9914
Epoch 2/200
 - 1s - loss: 0.0240 - acc: 0.9918 - val_loss: 0.0231 - val_acc: 0.9916
Epoch 3/200
 - 1s - loss: 0.0232 - acc: 0.9923 - val_loss: 0.0224 - val_acc: 0.9920
Epoch 4/200
 - 1s - loss: 0.0224 - acc: 0.9925 - val_loss: 0.0220 - val_acc: 0.9924
Epoch 5/200
 - 1s - loss: 0.0222 - acc: 0.9925 - val_loss: 0.0218 - val_acc: 0.9924
... (epochs 6–98 omitted: val_loss declines gradually from 0.0215 to 0.0195, val_acc 0.9924–0.9934)
Epoch 99/200
 - 1s - loss: 0.0190 - acc: 0.9940 - val_loss: 0.0195 - val_acc: 0.9934
 - 1s - loss: 0.0187 - acc: 0.9938 - val_loss: 0.0193 - val_acc: 0.9936
Epoch 158/200
 - 1s - loss: 0.0188 - acc: 0.9937 - val_loss: 0.0193 - val_acc: 0.9936
Epoch 159/200
 - 1s - loss: 0.0185 - acc: 0.9942 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 160/200
 - 1s - loss: 0.0187 - acc: 0.9943 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 161/200
 - 1s - loss: 0.0185 - acc: 0.9941 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 162/200
 - 1s - loss: 0.0187 - acc: 0.9937 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 163/200
 - 1s - loss: 0.0183 - acc: 0.9937 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 164/200
 - 1s - loss: 0.0184 - acc: 0.9941 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 165/200
 - 1s - loss: 0.0186 - acc: 0.9940 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 166/200
 - 1s - loss: 0.0188 - acc: 0.9938 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 167/200
 - 1s - loss: 0.0186 - acc: 0.9940 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 168/200
 - 1s - loss: 0.0183 - acc: 0.9944 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 169/200
 - 1s - loss: 0.0187 - acc: 0.9942 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 170/200
 - 1s - loss: 0.0184 - acc: 0.9940 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 171/200
 - 1s - loss: 0.0182 - acc: 0.9944 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 172/200
 - 1s - loss: 0.0182 - acc: 0.9943 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 173/200
 - 1s - loss: 0.0184 - acc: 0.9942 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 174/200
 - 1s - loss: 0.0187 - acc: 0.9937 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 175/200
 - 1s - loss: 0.0187 - acc: 0.9942 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 176/200
 - 1s - loss: 0.0184 - acc: 0.9944 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 177/200
 - 1s - loss: 0.0182 - acc: 0.9938 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 178/200
 - 1s - loss: 0.0182 - acc: 0.9941 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 179/200
 - 1s - loss: 0.0188 - acc: 0.9940 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 180/200
 - 1s - loss: 0.0181 - acc: 0.9946 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 181/200
 - 1s - loss: 0.0179 - acc: 0.9949 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 182/200
 - 1s - loss: 0.0185 - acc: 0.9945 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 183/200
 - 1s - loss: 0.0182 - acc: 0.9942 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 184/200
 - 1s - loss: 0.0184 - acc: 0.9940 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 185/200
 - 1s - loss: 0.0184 - acc: 0.9942 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 186/200
 - 1s - loss: 0.0181 - acc: 0.9941 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 187/200
 - 1s - loss: 0.0186 - acc: 0.9944 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 188/200
 - 1s - loss: 0.0189 - acc: 0.9937 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 189/200
 - 1s - loss: 0.0187 - acc: 0.9943 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 190/200
 - 1s - loss: 0.0187 - acc: 0.9938 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 191/200
 - 1s - loss: 0.0181 - acc: 0.9945 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 192/200
 - 1s - loss: 0.0180 - acc: 0.9940 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 193/200
 - 1s - loss: 0.0181 - acc: 0.9944 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 194/200
 - 1s - loss: 0.0183 - acc: 0.9943 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 195/200
 - 1s - loss: 0.0182 - acc: 0.9943 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 196/200
 - 1s - loss: 0.0180 - acc: 0.9944 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 197/200
 - 1s - loss: 0.0186 - acc: 0.9942 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 198/200
 - 1s - loss: 0.0187 - acc: 0.9939 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 199/200
 - 1s - loss: 0.0182 - acc: 0.9942 - val_loss: 0.0192 - val_acc: 0.9936
Epoch 200/200
 - 1s - loss: 0.0185 - acc: 0.9941 - val_loss: 0.0192 - val_acc: 0.9936
2018-03-27 09:17:26,153 [INFO] Evaluate...
2018-03-27 09:17:27,935 [INFO] Done!
2018-03-27 09:17:27,942 [INFO] tpe_transform took 0.002517 seconds
2018-03-27 09:17:27,942 [INFO] TPE using 10/10 trials with best loss 0.018868
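The `TPE using 10/10 trials with best loss 0.018868` lines above come from a hyperopt-style search driving repeated fit/evaluate cycles. A minimal sketch of such a search loop, using plain random search as a self-contained stand-in for TPE (the `objective`, its parameters, and the search ranges are all hypothetical, not taken from the notebook):

```python
import random

def objective(params):
    # Hypothetical stand-in for "train the classifier head on the merged
    # features and return validation loss"; here a deterministic toy surface.
    lr, dropout = params["lr"], params["dropout"]
    return (lr - 1e-3) ** 2 + (dropout - 0.5) ** 2

random.seed(0)
trials = []
for _ in range(10):  # the log above reports 10 trials
    params = {"lr": random.uniform(1e-4, 1e-2),
              "dropout": random.uniform(0.0, 0.8)}
    trials.append((objective(params), params))

# Mirror the "best loss" line: keep the trial with the lowest objective value.
best_loss, best_params = min(trials, key=lambda t: t[0])
```

In the actual run, `hyperopt.fmin` with `tpe.suggest` plays this role, proposing each new trial's hyperparameters from the history instead of sampling uniformly.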
2018-03-27 09:17:27,945 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
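The three `load features from ...` lines above correspond to reading bottleneck features exported from InceptionV3, Xception, and ResNet50 and merging them into one input for the classifier head. A minimal sketch of that merge step, with in-memory arrays standing in for the `original_*.h5` files (the per-backbone feature width of 2048 is an assumption):

```python
import numpy as np

# Stand-ins for the arrays read from original_InceptionV3.h5,
# original_Xception.h5, and original_ResNet50.h5 (in the notebook these
# would come from h5py); shapes are assumed, not taken from the files.
n_samples = 25000
inception_feats = np.zeros((n_samples, 2048))
xception_feats = np.zeros((n_samples, 2048))
resnet_feats = np.zeros((n_samples, 2048))

# Concatenate each sample's three feature vectors into one long vector,
# giving the single input the dense classifier head trains on.
merged = np.concatenate([inception_feats, xception_feats, resnet_feats],
                        axis=1)
```

Feeding the concatenated features to a small dense model (rather than fine-tuning the backbones) is what makes each epoch above take only about a second on 20,000 samples.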
2018-03-27 09:17:28,933 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 3s - loss: 0.1402 - acc: 0.9696 - val_loss: 0.0825 - val_acc: 0.9846
Epoch 2/200
 - 1s - loss: 0.0770 - acc: 0.9863 - val_loss: 0.0702 - val_acc: 0.9862
Epoch 3/200
 - 1s - loss: 0.0691 - acc: 0.9864 - val_loss: 0.0649 - val_acc: 0.9868
[... epochs 4-106 omitted: loss falls from 0.0654 to 0.0433, val_loss from 0.0618 to 0.0434, val_acc climbs from 0.9870 to 0.9902 ...]
Epoch 107/200
[... log truncated here: the remainder of this run and epochs 1-15 of the following trial are missing from the export ...]
 - 1s - loss: 0.0542 - acc: 0.9889 - val_loss: 0.0507 - val_acc: 0.9912
Epoch 17/200
 - 1s - loss: 0.0533 - acc: 0.9896 - val_loss: 0.0503 - val_acc: 0.9914
[... epochs 18-199 omitted: loss falls from 0.0529 to 0.0432, val_loss from 0.0500 to 0.0399, val_acc settles at 0.9918 ...]
Epoch 200/200
 - 1s - loss: 0.0432 - acc: 0.9897 - val_loss: 0.0399 - val_acc: 0.9918
2018-03-27 09:23:31,456 [INFO] Evaluate...
2018-03-27 09:23:33,351 [INFO] Done!
2018-03-27 09:23:33,357 [INFO] tpe_transform took 0.002472 seconds
2018-03-27 09:23:33,359 [INFO] TPE using 12/12 trials with best loss 0.018868
2018-03-27 09:23:33,361 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 09:23:34,355 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 3s - loss: 0.0990 - acc: 0.9765 - val_loss: 0.0554 - val_acc: 0.9888
Epoch 2/200
 - 1s - loss: 0.0536 - acc: 0.9878 - val_loss: 0.0473 - val_acc: 0.9890
Epoch 3/200
 - 1s - loss: 0.0482 - acc: 0.9889 - val_loss: 0.0439 - val_acc: 0.9898
... (epochs 4-197: loss falls gradually from 0.046 to 0.030; val_loss from 0.042 to 0.0293; val_acc plateaus at 0.9928 from epoch 36) ...
Epoch 198/200
 - 1s - loss: 0.0302 - acc: 0.9920 - val_loss: 0.0293 - val_acc: 0.9928
Epoch 199/200
 - 1s - loss: 0.0306 - acc: 0.9920 - val_loss: 0.0293 - val_acc: 0.9928
Epoch 200/200
 - 1s - loss: 0.0305 - acc: 0.9922 - val_loss: 0.0293 - val_acc: 0.9928
2018-03-27 09:26:34,514 [INFO] Evaluate...
2018-03-27 09:26:36,435 [INFO] Done!
2018-03-27 09:26:36,442 [INFO] tpe_transform took 0.003231 seconds
2018-03-27 09:26:36,442 [INFO] TPE using 13/13 trials with best loss 0.018868
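Each trial reloads precomputed bottleneck features from 「original_InceptionV3.h5」, 「original_Xception.h5」 and 「original_ResNet50.h5」, concatenates them into one feature vector per image, and fits a small classifier head on 20,000 training / 5,000 validation samples. A minimal sketch of that pattern, with synthetic arrays in place of the real `h5py` reads and a plain logistic-regression head in place of the Keras model (the shapes, sample count, and head are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the feature matrices read from the three .h5 files
# (the notebook would use something like h5py.File(...) instead).
feats = [rng.normal(size=(100, d)) for d in (2048, 2048, 2048)]
X = np.concatenate(feats, axis=1)          # (100, 6144) merged features
y = (rng.random(100) > 0.5).astype(float)  # binary dog/cat labels

# Tiny logistic-regression head trained by gradient descent, standing in
# for a Dense(1, activation='sigmoid') layer fitted with binary crossentropy.
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(200):                       # mirrors the 200-epoch fit
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    grad = p - y                           # d(loss)/d(logit)
    w -= 0.1 * (X.T @ grad) / len(y)
    b -= 0.1 * grad.mean()

loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
```

Because the expensive convolutional bases are frozen and their outputs cached, each trial only retrains this cheap head, which is why every epoch in the log takes about one second.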
2018-03-27 09:26:36,445 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 09:26:37,442 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 3s - loss: 0.1694 - acc: 0.9563 - val_loss: 0.0903 - val_acc: 0.9840
Epoch 2/200
 - 1s - loss: 0.0937 - acc: 0.9828 - val_loss: 0.0778 - val_acc: 0.9856
Epoch 3/200
 - 1s - loss: 0.0851 - acc: 0.9830 - val_loss: 0.0722 - val_acc: 0.9864
... (epochs 4-196: loss falls from 0.080 to ~0.053; val_loss from 0.069 to 0.0456; val_acc reaches 0.9914 at epoch 173) ...
Epoch 197/200
 - 1s - loss: 0.0539 - acc: 0.9860 - val_loss: 0.0456 - val_acc: 0.9914
Epoch 198/200
 - 1s - loss: 0.0532 - acc: 0.9870 - val_loss: 0.0456 - val_acc: 0.9914
Epoch 199/200
 - 1s - loss: 0.0529 - acc: 0.9880 - val_loss: 0.0456 - val_acc: 0.9914
Epoch 200/200
 - 1s - loss: 0.0531 - acc: 0.9875 - val_loss: 0.0456 - val_acc: 0.9914
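The per-epoch output above is hard to scan by eye. A small helper (not part of the original notebook) can parse Keras `verbose=2` log lines into per-epoch metric dicts, e.g. to plot `val_loss` afterwards:

```python
import re

# Matches "name: value" pairs in Keras verbose=2 metric lines such as:
#  - 1s - loss: 0.0539 - acc: 0.9875 - val_loss: 0.0465 - val_acc: 0.9912
METRIC_RE = re.compile(r"(\w+): ([0-9.]+)")

def parse_keras_log(lines):
    """Return a list of {metric: value} dicts, one per epoch metric line."""
    history = []
    for line in lines:
        if line.lstrip().startswith("- ") and "loss:" in line:
            history.append({k: float(v) for k, v in METRIC_RE.findall(line)})
    return history

log = [
    "Epoch 1/200",
    " - 3s - loss: 0.1504 - acc: 0.9632 - val_loss: 0.0824 - val_acc: 0.9872",
    "Epoch 2/200",
    " - 1s - loss: 0.0835 - acc: 0.9824 - val_loss: 0.0698 - val_acc: 0.9886",
]
print([h["val_loss"] for h in parse_keras_log(log)])  # [0.0824, 0.0698]
```

The same parser works on the copied log text of any of the runs in this section, since they all use the same `verbose=2` format.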
2018-03-27 09:29:38,389 [INFO] Evaluate...
2018-03-27 09:29:40,327 [INFO] Done!
2018-03-27 09:29:40,334 [INFO] tpe_transform took 0.002473 seconds
2018-03-27 09:29:40,334 [INFO] TPE using 14/14 trials with best loss 0.018868
2018-03-27 09:29:40,337 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 09:29:41,328 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 3s - loss: 0.1504 - acc: 0.9632 - val_loss: 0.0824 - val_acc: 0.9872
Epoch 2/200
 - 1s - loss: 0.0835 - acc: 0.9824 - val_loss: 0.0698 - val_acc: 0.9886
Epoch 3/200
 - 1s - loss: 0.0742 - acc: 0.9837 - val_loss: 0.0644 - val_acc: 0.9898
Epoch 4/200
 - 1s - loss: 0.0701 - acc: 0.9849 - val_loss: 0.0611 - val_acc: 0.9904
Epoch 5/200
 - 1s - loss: 0.0670 - acc: 0.9858 - val_loss: 0.0589 - val_acc: 0.9906
Epoch 6/200
 - 1s - loss: 0.0651 - acc: 0.9850 - val_loss: 0.0572 - val_acc: 0.9908
Epoch 7/200
 - 1s - loss: 0.0635 - acc: 0.9859 - val_loss: 0.0558 - val_acc: 0.9910
Epoch 8/200
 - 1s - loss: 0.0628 - acc: 0.9859 - val_loss: 0.0548 - val_acc: 0.9912
Epoch 9/200
 - 1s - loss: 0.0614 - acc: 0.9859 - val_loss: 0.0539 - val_acc: 0.9912
Epoch 10/200
 - 1s - loss: 0.0610 - acc: 0.9859 - val_loss: 0.0531 - val_acc: 0.9912
Epoch 11/200
 - 1s - loss: 0.0607 - acc: 0.9856 - val_loss: 0.0524 - val_acc: 0.9912
Epoch 12/200
 - 1s - loss: 0.0591 - acc: 0.9869 - val_loss: 0.0518 - val_acc: 0.9912
Epoch 13/200
 - 1s - loss: 0.0583 - acc: 0.9867 - val_loss: 0.0513 - val_acc: 0.9912
Epoch 14/200
 - 1s - loss: 0.0581 - acc: 0.9869 - val_loss: 0.0508 - val_acc: 0.9914
Epoch 15/200
 - 1s - loss: 0.0576 - acc: 0.9876 - val_loss: 0.0503 - val_acc: 0.9914
Epoch 16/200
 - 1s - loss: 0.0578 - acc: 0.9868 - val_loss: 0.0499 - val_acc: 0.9914
Epoch 17/200
 - 1s - loss: 0.0569 - acc: 0.9874 - val_loss: 0.0496 - val_acc: 0.9914
Epoch 18/200
 - 1s - loss: 0.0573 - acc: 0.9863 - val_loss: 0.0492 - val_acc: 0.9914
Epoch 19/200
 - 1s - loss: 0.0565 - acc: 0.9869 - val_loss: 0.0489 - val_acc: 0.9914
Epoch 20/200
 - 1s - loss: 0.0558 - acc: 0.9873 - val_loss: 0.0486 - val_acc: 0.9918
Epoch 21/200
 - 1s - loss: 0.0564 - acc: 0.9870 - val_loss: 0.0484 - val_acc: 0.9920
Epoch 22/200
 - 1s - loss: 0.0555 - acc: 0.9871 - val_loss: 0.0481 - val_acc: 0.9920
Epoch 23/200
 - 1s - loss: 0.0554 - acc: 0.9872 - val_loss: 0.0479 - val_acc: 0.9920
Epoch 24/200
 - 1s - loss: 0.0555 - acc: 0.9870 - val_loss: 0.0476 - val_acc: 0.9920
Epoch 25/200
 - 1s - loss: 0.0547 - acc: 0.9872 - val_loss: 0.0474 - val_acc: 0.9920
Epoch 26/200
 - 1s - loss: 0.0545 - acc: 0.9869 - val_loss: 0.0472 - val_acc: 0.9922
Epoch 27/200
 - 1s - loss: 0.0544 - acc: 0.9872 - val_loss: 0.0470 - val_acc: 0.9922
Epoch 28/200
 - 1s - loss: 0.0545 - acc: 0.9868 - val_loss: 0.0468 - val_acc: 0.9924
Epoch 29/200
 - 1s - loss: 0.0539 - acc: 0.9876 - val_loss: 0.0467 - val_acc: 0.9924
Epoch 30/200
 - 1s - loss: 0.0536 - acc: 0.9871 - val_loss: 0.0465 - val_acc: 0.9924
Epoch 31/200
 - 1s - loss: 0.0532 - acc: 0.9877 - val_loss: 0.0463 - val_acc: 0.9924
Epoch 32/200
 - 1s - loss: 0.0526 - acc: 0.9884 - val_loss: 0.0462 - val_acc: 0.9924
Epoch 33/200
 - 1s - loss: 0.0531 - acc: 0.9871 - val_loss: 0.0460 - val_acc: 0.9924
Epoch 34/200
 - 1s - loss: 0.0529 - acc: 0.9871 - val_loss: 0.0459 - val_acc: 0.9924
Epoch 35/200
 - 1s - loss: 0.0537 - acc: 0.9878 - val_loss: 0.0457 - val_acc: 0.9924
Epoch 36/200
 - 1s - loss: 0.0526 - acc: 0.9879 - val_loss: 0.0456 - val_acc: 0.9926
Epoch 37/200
 - 1s - loss: 0.0529 - acc: 0.9871 - val_loss: 0.0455 - val_acc: 0.9926
Epoch 38/200
 - 1s - loss: 0.0534 - acc: 0.9878 - val_loss: 0.0453 - val_acc: 0.9926
Epoch 39/200
 - 1s - loss: 0.0530 - acc: 0.9873 - val_loss: 0.0452 - val_acc: 0.9926
Epoch 40/200
 - 1s - loss: 0.0524 - acc: 0.9875 - val_loss: 0.0451 - val_acc: 0.9926
Epoch 41/200
 - 1s - loss: 0.0521 - acc: 0.9874 - val_loss: 0.0450 - val_acc: 0.9926
Epoch 42/200
 - 1s - loss: 0.0521 - acc: 0.9877 - val_loss: 0.0449 - val_acc: 0.9926
Epoch 43/200
 - 1s - loss: 0.0514 - acc: 0.9881 - val_loss: 0.0448 - val_acc: 0.9926
Epoch 44/200
 - 1s - loss: 0.0515 - acc: 0.9882 - val_loss: 0.0447 - val_acc: 0.9926
Epoch 45/200
 - 1s - loss: 0.0519 - acc: 0.9882 - val_loss: 0.0446 - val_acc: 0.9926
Epoch 46/200
 - 1s - loss: 0.0519 - acc: 0.9876 - val_loss: 0.0445 - val_acc: 0.9926
Epoch 47/200
 - 1s - loss: 0.0515 - acc: 0.9884 - val_loss: 0.0444 - val_acc: 0.9926
Epoch 48/200
 - 1s - loss: 0.0521 - acc: 0.9877 - val_loss: 0.0443 - val_acc: 0.9926
Epoch 49/200
 - 1s - loss: 0.0515 - acc: 0.9884 - val_loss: 0.0442 - val_acc: 0.9926
Epoch 50/200
 - 1s - loss: 0.0513 - acc: 0.9882 - val_loss: 0.0441 - val_acc: 0.9926
Epoch 51/200
 - 1s - loss: 0.0514 - acc: 0.9878 - val_loss: 0.0440 - val_acc: 0.9926
Epoch 52/200
 - 1s - loss: 0.0512 - acc: 0.9882 - val_loss: 0.0440 - val_acc: 0.9926
Epoch 53/200
 - 1s - loss: 0.0507 - acc: 0.9876 - val_loss: 0.0439 - val_acc: 0.9926
Epoch 54/200
 - 1s - loss: 0.0508 - acc: 0.9879 - val_loss: 0.0438 - val_acc: 0.9926
Epoch 55/200
 - 1s - loss: 0.0513 - acc: 0.9877 - val_loss: 0.0437 - val_acc: 0.9926
Epoch 56/200
 - 1s - loss: 0.0506 - acc: 0.9884 - val_loss: 0.0437 - val_acc: 0.9926
Epoch 57/200
 - 1s - loss: 0.0511 - acc: 0.9882 - val_loss: 0.0436 - val_acc: 0.9926
Epoch 58/200
 - 1s - loss: 0.0515 - acc: 0.9878 - val_loss: 0.0435 - val_acc: 0.9928
Epoch 59/200
 - 1s - loss: 0.0510 - acc: 0.9878 - val_loss: 0.0434 - val_acc: 0.9928
Epoch 60/200
 - 1s - loss: 0.0507 - acc: 0.9876 - val_loss: 0.0434 - val_acc: 0.9928
Epoch 61/200
 - 1s - loss: 0.0509 - acc: 0.9878 - val_loss: 0.0433 - val_acc: 0.9928
Epoch 62/200
 - 1s - loss: 0.0506 - acc: 0.9872 - val_loss: 0.0432 - val_acc: 0.9928
Epoch 63/200
 - 1s - loss: 0.0497 - acc: 0.9883 - val_loss: 0.0432 - val_acc: 0.9928
Epoch 64/200
 - 1s - loss: 0.0499 - acc: 0.9877 - val_loss: 0.0431 - val_acc: 0.9928
Epoch 65/200
 - 1s - loss: 0.0498 - acc: 0.9887 - val_loss: 0.0431 - val_acc: 0.9928
Epoch 66/200
 - 1s - loss: 0.0502 - acc: 0.9880 - val_loss: 0.0430 - val_acc: 0.9928
Epoch 67/200
 - 1s - loss: 0.0505 - acc: 0.9878 - val_loss: 0.0429 - val_acc: 0.9930
Epoch 68/200
 - 1s - loss: 0.0502 - acc: 0.9883 - val_loss: 0.0429 - val_acc: 0.9930
Epoch 69/200
 - 1s - loss: 0.0500 - acc: 0.9884 - val_loss: 0.0428 - val_acc: 0.9930
Epoch 70/200
 - 1s - loss: 0.0502 - acc: 0.9879 - val_loss: 0.0428 - val_acc: 0.9930
Epoch 71/200
 - 1s - loss: 0.0498 - acc: 0.9884 - val_loss: 0.0427 - val_acc: 0.9930
Epoch 72/200
 - 1s - loss: 0.0502 - acc: 0.9874 - val_loss: 0.0427 - val_acc: 0.9930
Epoch 73/200
 - 1s - loss: 0.0495 - acc: 0.9884 - val_loss: 0.0426 - val_acc: 0.9930
Epoch 74/200
 - 1s - loss: 0.0496 - acc: 0.9881 - val_loss: 0.0426 - val_acc: 0.9930
Epoch 75/200
 - 1s - loss: 0.0496 - acc: 0.9881 - val_loss: 0.0425 - val_acc: 0.9930
Epoch 76/200
 - 1s - loss: 0.0497 - acc: 0.9882 - val_loss: 0.0425 - val_acc: 0.9930
Epoch 77/200
 - 1s - loss: 0.0498 - acc: 0.9880 - val_loss: 0.0424 - val_acc: 0.9930
Epoch 78/200
 - 1s - loss: 0.0496 - acc: 0.9879 - val_loss: 0.0424 - val_acc: 0.9930
Epoch 79/200
 - 1s - loss: 0.0493 - acc: 0.9878 - val_loss: 0.0423 - val_acc: 0.9930
Epoch 80/200
 - 1s - loss: 0.0493 - acc: 0.9886 - val_loss: 0.0423 - val_acc: 0.9930
Epoch 81/200
 - 1s - loss: 0.0494 - acc: 0.9876 - val_loss: 0.0422 - val_acc: 0.9930
Epoch 82/200
 - 1s - loss: 0.0494 - acc: 0.9880 - val_loss: 0.0422 - val_acc: 0.9930
Epoch 83/200
 - 1s - loss: 0.0489 - acc: 0.9884 - val_loss: 0.0421 - val_acc: 0.9930
Epoch 84/200
 - 1s - loss: 0.0492 - acc: 0.9885 - val_loss: 0.0421 - val_acc: 0.9930
Epoch 85/200
 - 1s - loss: 0.0485 - acc: 0.9882 - val_loss: 0.0420 - val_acc: 0.9930
Epoch 86/200
 - 1s - loss: 0.0494 - acc: 0.9885 - val_loss: 0.0420 - val_acc: 0.9930
Epoch 87/200
 - 1s - loss: 0.0484 - acc: 0.9893 - val_loss: 0.0419 - val_acc: 0.9930
Epoch 88/200
 - 1s - loss: 0.0489 - acc: 0.9880 - val_loss: 0.0419 - val_acc: 0.9930
Epoch 89/200
 - 1s - loss: 0.0495 - acc: 0.9879 - val_loss: 0.0419 - val_acc: 0.9930
Epoch 90/200
 - 1s - loss: 0.0488 - acc: 0.9881 - val_loss: 0.0418 - val_acc: 0.9930
Epoch 91/200
 - 1s - loss: 0.0490 - acc: 0.9887 - val_loss: 0.0418 - val_acc: 0.9930
Epoch 92/200
 - 1s - loss: 0.0482 - acc: 0.9882 - val_loss: 0.0417 - val_acc: 0.9930
Epoch 93/200
 - 1s - loss: 0.0490 - acc: 0.9886 - val_loss: 0.0417 - val_acc: 0.9930
Epoch 94/200
 - 1s - loss: 0.0486 - acc: 0.9882 - val_loss: 0.0417 - val_acc: 0.9930
Epoch 95/200
 - 1s - loss: 0.0490 - acc: 0.9876 - val_loss: 0.0416 - val_acc: 0.9930
Epoch 96/200
 - 1s - loss: 0.0488 - acc: 0.9880 - val_loss: 0.0416 - val_acc: 0.9930
Epoch 97/200
 - 1s - loss: 0.0486 - acc: 0.9882 - val_loss: 0.0416 - val_acc: 0.9930
Epoch 98/200
 - 1s - loss: 0.0484 - acc: 0.9888 - val_loss: 0.0415 - val_acc: 0.9930
Epoch 99/200
 - 1s - loss: 0.0487 - acc: 0.9873 - val_loss: 0.0415 - val_acc: 0.9930
Epoch 100/200
 - 1s - loss: 0.0489 - acc: 0.9878 - val_loss: 0.0414 - val_acc: 0.9930
Epoch 101/200
 - 1s - loss: 0.0487 - acc: 0.9882 - val_loss: 0.0414 - val_acc: 0.9930
Epoch 102/200
 - 1s - loss: 0.0491 - acc: 0.9881 - val_loss: 0.0414 - val_acc: 0.9930
Epoch 103/200
 - 1s - loss: 0.0487 - acc: 0.9884 - val_loss: 0.0413 - val_acc: 0.9930
Epoch 104/200
 - 1s - loss: 0.0484 - acc: 0.9887 - val_loss: 0.0413 - val_acc: 0.9930
Epoch 105/200
 - 1s - loss: 0.0480 - acc: 0.9882 - val_loss: 0.0413 - val_acc: 0.9930
Epoch 106/200
 - 1s - loss: 0.0488 - acc: 0.9877 - val_loss: 0.0412 - val_acc: 0.9930
Epoch 107/200
 - 1s - loss: 0.0483 - acc: 0.9883 - val_loss: 0.0412 - val_acc: 0.9930
Epoch 108/200
 - 1s - loss: 0.0481 - acc: 0.9881 - val_loss: 0.0412 - val_acc: 0.9930
Epoch 109/200
 - 1s - loss: 0.0481 - acc: 0.9882 - val_loss: 0.0411 - val_acc: 0.9930
Epoch 110/200
 - 1s - loss: 0.0480 - acc: 0.9880 - val_loss: 0.0411 - val_acc: 0.9930
Epoch 111/200
 - 1s - loss: 0.0485 - acc: 0.9883 - val_loss: 0.0411 - val_acc: 0.9930
Epoch 112/200
 - 1s - loss: 0.0480 - acc: 0.9887 - val_loss: 0.0411 - val_acc: 0.9930
Epoch 113/200
 - 1s - loss: 0.0480 - acc: 0.9883 - val_loss: 0.0410 - val_acc: 0.9930
Epoch 114/200
 - 1s - loss: 0.0481 - acc: 0.9888 - val_loss: 0.0410 - val_acc: 0.9930
Epoch 115/200
 - 1s - loss: 0.0480 - acc: 0.9887 - val_loss: 0.0410 - val_acc: 0.9930
Epoch 116/200
 - 1s - loss: 0.0480 - acc: 0.9888 - val_loss: 0.0409 - val_acc: 0.9930
Epoch 117/200
 - 1s - loss: 0.0483 - acc: 0.9876 - val_loss: 0.0409 - val_acc: 0.9930
Epoch 118/200
 - 1s - loss: 0.0483 - acc: 0.9881 - val_loss: 0.0409 - val_acc: 0.9930
Epoch 119/200
 - 1s - loss: 0.0479 - acc: 0.9878 - val_loss: 0.0409 - val_acc: 0.9930
Epoch 120/200
 - 1s - loss: 0.0478 - acc: 0.9882 - val_loss: 0.0408 - val_acc: 0.9930
Epoch 121/200
 - 1s - loss: 0.0476 - acc: 0.9886 - val_loss: 0.0408 - val_acc: 0.9930
Epoch 122/200
 - 1s - loss: 0.0477 - acc: 0.9893 - val_loss: 0.0408 - val_acc: 0.9930
Epoch 123/200
 - 1s - loss: 0.0479 - acc: 0.9888 - val_loss: 0.0407 - val_acc: 0.9930
Epoch 124/200
 - 1s - loss: 0.0475 - acc: 0.9888 - val_loss: 0.0407 - val_acc: 0.9930
Epoch 125/200
 - 1s - loss: 0.0477 - acc: 0.9884 - val_loss: 0.0407 - val_acc: 0.9930
Epoch 126/200
 - 1s - loss: 0.0481 - acc: 0.9884 - val_loss: 0.0407 - val_acc: 0.9930
Epoch 127/200
 - 1s - loss: 0.0477 - acc: 0.9892 - val_loss: 0.0406 - val_acc: 0.9930
Epoch 128/200
 - 1s - loss: 0.0478 - acc: 0.9873 - val_loss: 0.0406 - val_acc: 0.9930
Epoch 129/200
 - 1s - loss: 0.0479 - acc: 0.9882 - val_loss: 0.0406 - val_acc: 0.9930
Epoch 130/200
 - 1s - loss: 0.0477 - acc: 0.9890 - val_loss: 0.0406 - val_acc: 0.9930
Epoch 131/200
 - 1s - loss: 0.0480 - acc: 0.9881 - val_loss: 0.0405 - val_acc: 0.9930
Epoch 132/200
 - 1s - loss: 0.0475 - acc: 0.9890 - val_loss: 0.0405 - val_acc: 0.9930
Epoch 133/200
 - 1s - loss: 0.0476 - acc: 0.9883 - val_loss: 0.0405 - val_acc: 0.9930
Epoch 134/200
 - 1s - loss: 0.0471 - acc: 0.9890 - val_loss: 0.0405 - val_acc: 0.9930
Epoch 135/200
 - 1s - loss: 0.0476 - acc: 0.9880 - val_loss: 0.0404 - val_acc: 0.9930
Epoch 136/200
 - 1s - loss: 0.0482 - acc: 0.9881 - val_loss: 0.0404 - val_acc: 0.9930
Epoch 137/200
 - 1s - loss: 0.0473 - acc: 0.9888 - val_loss: 0.0404 - val_acc: 0.9930
Epoch 138/200
 - 1s - loss: 0.0474 - acc: 0.9888 - val_loss: 0.0404 - val_acc: 0.9930
Epoch 139/200
 - 1s - loss: 0.0474 - acc: 0.9883 - val_loss: 0.0403 - val_acc: 0.9930
Epoch 140/200
 - 1s - loss: 0.0479 - acc: 0.9882 - val_loss: 0.0403 - val_acc: 0.9930
Epoch 141/200
 - 1s - loss: 0.0482 - acc: 0.9883 - val_loss: 0.0403 - val_acc: 0.9930
Epoch 142/200
 - 1s - loss: 0.0466 - acc: 0.9890 - val_loss: 0.0403 - val_acc: 0.9930
Epoch 143/200
 - 1s - loss: 0.0478 - acc: 0.9886 - val_loss: 0.0402 - val_acc: 0.9930
Epoch 144/200
 - 1s - loss: 0.0475 - acc: 0.9878 - val_loss: 0.0402 - val_acc: 0.9930
Epoch 145/200
 - 1s - loss: 0.0472 - acc: 0.9891 - val_loss: 0.0402 - val_acc: 0.9930
Epoch 146/200
 - 1s - loss: 0.0477 - acc: 0.9885 - val_loss: 0.0402 - val_acc: 0.9932
Epoch 147/200
 - 1s - loss: 0.0478 - acc: 0.9884 - val_loss: 0.0402 - val_acc: 0.9932
Epoch 148/200
 - 1s - loss: 0.0476 - acc: 0.9885 - val_loss: 0.0401 - val_acc: 0.9932
Epoch 149/200
 - 1s - loss: 0.0475 - acc: 0.9883 - val_loss: 0.0401 - val_acc: 0.9932
Epoch 150/200
 - 1s - loss: 0.0472 - acc: 0.9884 - val_loss: 0.0401 - val_acc: 0.9932
Epoch 151/200
 - 1s - loss: 0.0471 - acc: 0.9888 - val_loss: 0.0401 - val_acc: 0.9932
Epoch 152/200
 - 1s - loss: 0.0471 - acc: 0.9891 - val_loss: 0.0401 - val_acc: 0.9932
Epoch 153/200
 - 1s - loss: 0.0473 - acc: 0.9887 - val_loss: 0.0400 - val_acc: 0.9932
Epoch 154/200
 - 1s - loss: 0.0476 - acc: 0.9882 - val_loss: 0.0400 - val_acc: 0.9932
Epoch 155/200
 - 1s - loss: 0.0469 - acc: 0.9879 - val_loss: 0.0400 - val_acc: 0.9932
Epoch 156/200
 - 1s - loss: 0.0471 - acc: 0.9882 - val_loss: 0.0400 - val_acc: 0.9932
Epoch 157/200
 - 1s - loss: 0.0469 - acc: 0.9888 - val_loss: 0.0400 - val_acc: 0.9932
Epoch 158/200
 - 1s - loss: 0.0469 - acc: 0.9891 - val_loss: 0.0399 - val_acc: 0.9932
Epoch 159/200
 - 1s - loss: 0.0469 - acc: 0.9890 - val_loss: 0.0399 - val_acc: 0.9932
Epoch 160/200
 - 1s - loss: 0.0470 - acc: 0.9888 - val_loss: 0.0399 - val_acc: 0.9932
Epoch 161/200
 - 1s - loss: 0.0469 - acc: 0.9889 - val_loss: 0.0399 - val_acc: 0.9932
Epoch 162/200
 - 1s - loss: 0.0465 - acc: 0.9892 - val_loss: 0.0399 - val_acc: 0.9932
Epoch 163/200
 - 1s - loss: 0.0472 - acc: 0.9880 - val_loss: 0.0398 - val_acc: 0.9932
Epoch 164/200
 - 1s - loss: 0.0467 - acc: 0.9892 - val_loss: 0.0398 - val_acc: 0.9932
Epoch 165/200
 - 1s - loss: 0.0466 - acc: 0.9888 - val_loss: 0.0398 - val_acc: 0.9932
Epoch 166/200
 - 1s - loss: 0.0469 - acc: 0.9884 - val_loss: 0.0398 - val_acc: 0.9932
Epoch 167/200
 - 1s - loss: 0.0467 - acc: 0.9890 - val_loss: 0.0398 - val_acc: 0.9932
Epoch 168/200
 - 1s - loss: 0.0461 - acc: 0.9890 - val_loss: 0.0397 - val_acc: 0.9932
Epoch 169/200
 - 1s - loss: 0.0459 - acc: 0.9890 - val_loss: 0.0397 - val_acc: 0.9932
Epoch 170/200
 - 1s - loss: 0.0465 - acc: 0.9888 - val_loss: 0.0397 - val_acc: 0.9932
Epoch 171/200
 - 1s - loss: 0.0473 - acc: 0.9882 - val_loss: 0.0397 - val_acc: 0.9932
Epoch 172/200
 - 1s - loss: 0.0468 - acc: 0.9888 - val_loss: 0.0397 - val_acc: 0.9932
Epoch 173/200
 - 1s - loss: 0.0473 - acc: 0.9883 - val_loss: 0.0396 - val_acc: 0.9932
Epoch 174/200
 - 1s - loss: 0.0469 - acc: 0.9889 - val_loss: 0.0396 - val_acc: 0.9932
Epoch 175/200
 - 1s - loss: 0.0467 - acc: 0.9887 - val_loss: 0.0396 - val_acc: 0.9932
Epoch 176/200
 - 1s - loss: 0.0463 - acc: 0.9882 - val_loss: 0.0396 - val_acc: 0.9932
Epoch 177/200
 - 1s - loss: 0.0461 - acc: 0.9888 - val_loss: 0.0396 - val_acc: 0.9932
Epoch 178/200
 - 1s - loss: 0.0467 - acc: 0.9888 - val_loss: 0.0396 - val_acc: 0.9932
Epoch 179/200
 - 1s - loss: 0.0465 - acc: 0.9887 - val_loss: 0.0395 - val_acc: 0.9932
Epoch 180/200
 - 1s - loss: 0.0465 - acc: 0.9880 - val_loss: 0.0395 - val_acc: 0.9932
Epoch 181/200
 - 1s - loss: 0.0468 - acc: 0.9886 - val_loss: 0.0395 - val_acc: 0.9932
Epoch 182/200
 - 1s - loss: 0.0466 - acc: 0.9882 - val_loss: 0.0395 - val_acc: 0.9932
Epoch 183/200
 - 1s - loss: 0.0460 - acc: 0.9890 - val_loss: 0.0395 - val_acc: 0.9932
Epoch 184/200
 - 1s - loss: 0.0464 - acc: 0.9887 - val_loss: 0.0395 - val_acc: 0.9932
Epoch 185/200
 - 1s - loss: 0.0469 - acc: 0.9889 - val_loss: 0.0394 - val_acc: 0.9932
Epoch 186/200
 - 1s - loss: 0.0465 - acc: 0.9885 - val_loss: 0.0394 - val_acc: 0.9932
Epoch 187/200
 - 1s - loss: 0.0472 - acc: 0.9883 - val_loss: 0.0394 - val_acc: 0.9932
Epoch 188/200
 - 1s - loss: 0.0457 - acc: 0.9892 - val_loss: 0.0394 - val_acc: 0.9932
Epoch 189/200
 - 1s - loss: 0.0463 - acc: 0.9887 - val_loss: 0.0394 - val_acc: 0.9932
Epoch 190/200
 - 1s - loss: 0.0466 - acc: 0.9886 - val_loss: 0.0394 - val_acc: 0.9932
Epoch 191/200
 - 1s - loss: 0.0466 - acc: 0.9886 - val_loss: 0.0393 - val_acc: 0.9932
Epoch 192/200
 - 1s - loss: 0.0465 - acc: 0.9887 - val_loss: 0.0393 - val_acc: 0.9932
Epoch 193/200
 - 1s - loss: 0.0471 - acc: 0.9887 - val_loss: 0.0393 - val_acc: 0.9932
Epoch 194/200
 - 1s - loss: 0.0466 - acc: 0.9887 - val_loss: 0.0393 - val_acc: 0.9934
Epoch 195/200
 - 1s - loss: 0.0465 - acc: 0.9887 - val_loss: 0.0393 - val_acc: 0.9934
Epoch 196/200
 - 1s - loss: 0.0470 - acc: 0.9889 - val_loss: 0.0393 - val_acc: 0.9934
Epoch 197/200
 - 1s - loss: 0.0466 - acc: 0.9884 - val_loss: 0.0393 - val_acc: 0.9934
Epoch 198/200
 - 1s - loss: 0.0466 - acc: 0.9886 - val_loss: 0.0392 - val_acc: 0.9934
Epoch 199/200
 - 1s - loss: 0.0456 - acc: 0.9888 - val_loss: 0.0392 - val_acc: 0.9934
Epoch 200/200
 - 1s - loss: 0.0459 - acc: 0.9888 - val_loss: 0.0392 - val_acc: 0.9934
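In these runs val_loss improves by only about 1e-4 per epoch after roughly epoch 50, so most of the 200 epochs add nothing. Keras ships an `EarlyStopping(monitor='val_loss', min_delta=..., patience=...)` callback for exactly this; the sketch below reimplements its core stopping rule in plain Python (a simplified illustration, not the callback's actual source) to show where such a run would halt:

```python
def early_stop_epoch(val_losses, min_delta=1e-4, patience=10):
    """Return the 1-based epoch at which training would stop, or None.

    Mirrors the core EarlyStopping rule: stop once val_loss has failed
    to improve by more than `min_delta` for `patience` epochs in a row.
    """
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if best - loss > min_delta:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return None

# Synthetic curve: fast improvement, then a flat plateau.
curve = [0.10, 0.08, 0.06, 0.05] + [0.05] * 20
print(early_stop_epoch(curve, min_delta=0.001, patience=5))  # 9
```

With the real callback this would be passed as `callbacks=[EarlyStopping(monitor='val_loss', min_delta=1e-4, patience=10)]` to `model.fit`, trimming runs like the one above well before epoch 200.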
2018-03-27 09:32:42,385 [INFO] Evaluate...
2018-03-27 09:32:44,364 [INFO] Done!
2018-03-27 09:32:44,371 [INFO] tpe_transform took 0.002451 seconds
2018-03-27 09:32:44,371 [INFO] TPE using 15/15 trials with best loss 0.018868
2018-03-27 09:32:44,374 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 09:32:45,364 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
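The 「original_*.h5」 files hold pre-extracted bottleneck features for InceptionV3, Xception, and ResNet50, concatenated before fitting the small classifier head. Assuming each file stores its arrays under dataset names like `train` and `label` (a hypothetical layout; the actual names come from `helper.py`), loading and concatenating them might look like:

```python
import numpy as np
import h5py

def load_features(paths):
    """Concatenate per-model feature arrays along the feature axis."""
    X_parts, y = [], None
    for path in paths:
        with h5py.File(path, "r") as f:
            X_parts.append(np.array(f["train"]))  # hypothetical dataset name
            y = np.array(f["label"])              # labels repeat per file
    return np.concatenate(X_parts, axis=1), y

# Self-contained demo with small fake feature files.
for name, dim in [("demo_InceptionV3.h5", 4), ("demo_Xception.h5", 3)]:
    with h5py.File(name, "w") as f:
        f.create_dataset("train", data=np.random.rand(10, dim))
        f.create_dataset("label", data=np.arange(10) % 2)

X, y = load_features(["demo_InceptionV3.h5", "demo_Xception.h5"])
print(X.shape, y.shape)  # (10, 7) (10,)
```

Each TPE trial then refits the head model on this fixed `(X, y)`, which is why a full 200-epoch fit takes only a few minutes per trial.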
Epoch 1/200
 - 3s - loss: 0.0817 - acc: 0.9718 - val_loss: 0.0422 - val_acc: 0.9888
Epoch 2/200
 - 1s - loss: 0.0416 - acc: 0.9878 - val_loss: 0.0393 - val_acc: 0.9890
Epoch 3/200
 - 1s - loss: 0.0395 - acc: 0.9886 - val_loss: 0.0379 - val_acc: 0.9890
Epoch 4/200
 - 1s - loss: 0.0381 - acc: 0.9888 - val_loss: 0.0369 - val_acc: 0.9888
Epoch 5/200
 - 1s - loss: 0.0368 - acc: 0.9893 - val_loss: 0.0364 - val_acc: 0.9888
Epoch 6/200
 - 1s - loss: 0.0361 - acc: 0.9901 - val_loss: 0.0359 - val_acc: 0.9886
Epoch 7/200
 - 1s - loss: 0.0349 - acc: 0.9899 - val_loss: 0.0355 - val_acc: 0.9886
Epoch 8/200
 - 1s - loss: 0.0354 - acc: 0.9896 - val_loss: 0.0352 - val_acc: 0.9886
Epoch 9/200
 - 1s - loss: 0.0349 - acc: 0.9902 - val_loss: 0.0349 - val_acc: 0.9886
Epoch 10/200
 - 1s - loss: 0.0352 - acc: 0.9900 - val_loss: 0.0346 - val_acc: 0.9886
Epoch 11/200
 - 1s - loss: 0.0337 - acc: 0.9910 - val_loss: 0.0344 - val_acc: 0.9888
Epoch 12/200
 - 1s - loss: 0.0349 - acc: 0.9901 - val_loss: 0.0343 - val_acc: 0.9888
Epoch 13/200
 - 1s - loss: 0.0334 - acc: 0.9911 - val_loss: 0.0341 - val_acc: 0.9888
Epoch 14/200
 - 1s - loss: 0.0333 - acc: 0.9905 - val_loss: 0.0340 - val_acc: 0.9888
Epoch 15/200
 - 1s - loss: 0.0335 - acc: 0.9899 - val_loss: 0.0339 - val_acc: 0.9888
Epoch 16/200
 - 1s - loss: 0.0328 - acc: 0.9905 - val_loss: 0.0337 - val_acc: 0.9888
Epoch 17/200
 - 1s - loss: 0.0332 - acc: 0.9910 - val_loss: 0.0336 - val_acc: 0.9888
Epoch 18/200
 - 1s - loss: 0.0332 - acc: 0.9906 - val_loss: 0.0335 - val_acc: 0.9888
Epoch 19/200
 - 1s - loss: 0.0328 - acc: 0.9906 - val_loss: 0.0334 - val_acc: 0.9888
Epoch 20/200
 - 1s - loss: 0.0322 - acc: 0.9910 - val_loss: 0.0333 - val_acc: 0.9888
Epoch 21/200
 - 1s - loss: 0.0325 - acc: 0.9908 - val_loss: 0.0332 - val_acc: 0.9888
Epoch 22/200
 - 1s - loss: 0.0327 - acc: 0.9905 - val_loss: 0.0332 - val_acc: 0.9888
Epoch 23/200
 - 1s - loss: 0.0323 - acc: 0.9909 - val_loss: 0.0331 - val_acc: 0.9888
Epoch 24/200
 - 1s - loss: 0.0325 - acc: 0.9907 - val_loss: 0.0330 - val_acc: 0.9888
Epoch 25/200
 - 1s - loss: 0.0317 - acc: 0.9913 - val_loss: 0.0329 - val_acc: 0.9886
Epoch 26/200
 - 1s - loss: 0.0320 - acc: 0.9911 - val_loss: 0.0329 - val_acc: 0.9886
Epoch 27/200
 - 1s - loss: 0.0320 - acc: 0.9908 - val_loss: 0.0328 - val_acc: 0.9886
Epoch 28/200
 - 1s - loss: 0.0321 - acc: 0.9913 - val_loss: 0.0328 - val_acc: 0.9886
Epoch 29/200
 - 1s - loss: 0.0329 - acc: 0.9902 - val_loss: 0.0327 - val_acc: 0.9886
Epoch 30/200
 - 1s - loss: 0.0326 - acc: 0.9905 - val_loss: 0.0327 - val_acc: 0.9884
Epoch 31/200
 - 1s - loss: 0.0317 - acc: 0.9902 - val_loss: 0.0326 - val_acc: 0.9884
Epoch 32/200
 - 1s - loss: 0.0319 - acc: 0.9913 - val_loss: 0.0326 - val_acc: 0.9884
Epoch 33/200
 - 1s - loss: 0.0315 - acc: 0.9910 - val_loss: 0.0325 - val_acc: 0.9884
Epoch 34/200
 - 1s - loss: 0.0309 - acc: 0.9916 - val_loss: 0.0325 - val_acc: 0.9884
Epoch 35/200
 - 1s - loss: 0.0315 - acc: 0.9909 - val_loss: 0.0324 - val_acc: 0.9884
Epoch 36/200
 - 1s - loss: 0.0316 - acc: 0.9910 - val_loss: 0.0324 - val_acc: 0.9884
Epoch 37/200
 - 1s - loss: 0.0309 - acc: 0.9906 - val_loss: 0.0323 - val_acc: 0.9884
Epoch 38/200
 - 1s - loss: 0.0309 - acc: 0.9917 - val_loss: 0.0323 - val_acc: 0.9884
Epoch 39/200
 - 1s - loss: 0.0316 - acc: 0.9909 - val_loss: 0.0323 - val_acc: 0.9884
Epoch 40/200
 - 1s - loss: 0.0314 - acc: 0.9910 - val_loss: 0.0322 - val_acc: 0.9884
Epoch 41/200
 - 1s - loss: 0.0306 - acc: 0.9918 - val_loss: 0.0322 - val_acc: 0.9884
Epoch 42/200
 - 1s - loss: 0.0310 - acc: 0.9915 - val_loss: 0.0321 - val_acc: 0.9884
Epoch 43/200
 - 1s - loss: 0.0306 - acc: 0.9914 - val_loss: 0.0321 - val_acc: 0.9884
Epoch 44/200
 - 1s - loss: 0.0315 - acc: 0.9904 - val_loss: 0.0321 - val_acc: 0.9884
Epoch 45/200
 - 1s - loss: 0.0312 - acc: 0.9906 - val_loss: 0.0320 - val_acc: 0.9884
Epoch 46/200
 - 1s - loss: 0.0300 - acc: 0.9915 - val_loss: 0.0320 - val_acc: 0.9884
Epoch 47/200
 - 1s - loss: 0.0312 - acc: 0.9908 - val_loss: 0.0320 - val_acc: 0.9884
Epoch 48/200
 - 1s - loss: 0.0312 - acc: 0.9909 - val_loss: 0.0320 - val_acc: 0.9884
Epoch 49/200
 - 1s - loss: 0.0302 - acc: 0.9914 - val_loss: 0.0319 - val_acc: 0.9884
Epoch 50/200
 - 1s - loss: 0.0309 - acc: 0.9918 - val_loss: 0.0319 - val_acc: 0.9884
Epoch 51/200
 - 1s - loss: 0.0301 - acc: 0.9909 - val_loss: 0.0319 - val_acc: 0.9884
Epoch 52/200
 - 1s - loss: 0.0306 - acc: 0.9914 - val_loss: 0.0318 - val_acc: 0.9884
Epoch 53/200
 - 1s - loss: 0.0303 - acc: 0.9919 - val_loss: 0.0318 - val_acc: 0.9884
Epoch 54/200
 - 1s - loss: 0.0303 - acc: 0.9914 - val_loss: 0.0318 - val_acc: 0.9884
Epoch 55/200
 - 1s - loss: 0.0309 - acc: 0.9906 - val_loss: 0.0318 - val_acc: 0.9884
Epoch 56/200
 - 1s - loss: 0.0301 - acc: 0.9919 - val_loss: 0.0318 - val_acc: 0.9884
Epoch 57/200
 - 1s - loss: 0.0307 - acc: 0.9905 - val_loss: 0.0317 - val_acc: 0.9884
Epoch 58/200
 - 1s - loss: 0.0302 - acc: 0.9913 - val_loss: 0.0317 - val_acc: 0.9884
Epoch 59/200
 - 1s - loss: 0.0310 - acc: 0.9911 - val_loss: 0.0317 - val_acc: 0.9886
Epoch 60/200
 - 1s - loss: 0.0310 - acc: 0.9908 - val_loss: 0.0317 - val_acc: 0.9886
Epoch 61/200
 - 1s - loss: 0.0306 - acc: 0.9909 - val_loss: 0.0316 - val_acc: 0.9886
Epoch 62/200
 - 1s - loss: 0.0302 - acc: 0.9914 - val_loss: 0.0316 - val_acc: 0.9886
Epoch 63/200
 - 1s - loss: 0.0302 - acc: 0.9909 - val_loss: 0.0316 - val_acc: 0.9886
Epoch 64/200
 - 1s - loss: 0.0304 - acc: 0.9919 - val_loss: 0.0316 - val_acc: 0.9886
Epoch 65/200
 - 1s - loss: 0.0304 - acc: 0.9909 - val_loss: 0.0316 - val_acc: 0.9886
Epoch 66/200
 - 1s - loss: 0.0309 - acc: 0.9913 - val_loss: 0.0315 - val_acc: 0.9886
Epoch 67/200
 - 1s - loss: 0.0297 - acc: 0.9908 - val_loss: 0.0315 - val_acc: 0.9886
Epoch 68/200
 - 1s - loss: 0.0304 - acc: 0.9917 - val_loss: 0.0315 - val_acc: 0.9886
Epoch 69/200
 - 1s - loss: 0.0303 - acc: 0.9913 - val_loss: 0.0315 - val_acc: 0.9886
Epoch 70/200
 - 1s - loss: 0.0296 - acc: 0.9920 - val_loss: 0.0315 - val_acc: 0.9886
Epoch 71/200
 - 1s - loss: 0.0301 - acc: 0.9918 - val_loss: 0.0315 - val_acc: 0.9886
Epoch 72/200
 - 1s - loss: 0.0308 - acc: 0.9915 - val_loss: 0.0314 - val_acc: 0.9886
Epoch 73/200
 - 1s - loss: 0.0304 - acc: 0.9911 - val_loss: 0.0314 - val_acc: 0.9886
Epoch 74/200
 - 1s - loss: 0.0297 - acc: 0.9918 - val_loss: 0.0314 - val_acc: 0.9886
Epoch 75/200
 - 1s - loss: 0.0298 - acc: 0.9917 - val_loss: 0.0314 - val_acc: 0.9886
Epoch 76/200
 - 1s - loss: 0.0301 - acc: 0.9908 - val_loss: 0.0314 - val_acc: 0.9886
Epoch 77/200
 - 1s - loss: 0.0306 - acc: 0.9910 - val_loss: 0.0314 - val_acc: 0.9886
Epoch 78/200
 - 1s - loss: 0.0299 - acc: 0.9918 - val_loss: 0.0313 - val_acc: 0.9886
Epoch 79/200
 - 1s - loss: 0.0301 - acc: 0.9909 - val_loss: 0.0313 - val_acc: 0.9886
Epoch 80/200
 - 1s - loss: 0.0307 - acc: 0.9909 - val_loss: 0.0313 - val_acc: 0.9886
Epoch 81/200
 - 1s - loss: 0.0293 - acc: 0.9921 - val_loss: 0.0313 - val_acc: 0.9886
Epoch 82/200
 - 1s - loss: 0.0291 - acc: 0.9918 - val_loss: 0.0313 - val_acc: 0.9886
Epoch 83/200
 - 1s - loss: 0.0304 - acc: 0.9906 - val_loss: 0.0313 - val_acc: 0.9886
Epoch 84/200
 - 1s - loss: 0.0301 - acc: 0.9919 - val_loss: 0.0313 - val_acc: 0.9886
Epoch 85/200
 - 1s - loss: 0.0295 - acc: 0.9907 - val_loss: 0.0312 - val_acc: 0.9886
Epoch 86/200
 - 1s - loss: 0.0299 - acc: 0.9910 - val_loss: 0.0312 - val_acc: 0.9886
Epoch 87/200
 - 1s - loss: 0.0302 - acc: 0.9909 - val_loss: 0.0312 - val_acc: 0.9886
Epoch 88/200
 - 1s - loss: 0.0292 - acc: 0.9917 - val_loss: 0.0312 - val_acc: 0.9886
Epoch 89/200
 - 1s - loss: 0.0301 - acc: 0.9909 - val_loss: 0.0312 - val_acc: 0.9886
Epoch 90/200
 - 1s - loss: 0.0294 - acc: 0.9915 - val_loss: 0.0312 - val_acc: 0.9886
Epoch 91/200
 - 1s - loss: 0.0299 - acc: 0.9914 - val_loss: 0.0312 - val_acc: 0.9886
Epoch 92/200
 - 1s - loss: 0.0301 - acc: 0.9916 - val_loss: 0.0311 - val_acc: 0.9886
Epoch 93/200
 - 1s - loss: 0.0293 - acc: 0.9918 - val_loss: 0.0311 - val_acc: 0.9886
Epoch 94/200
 - 1s - loss: 0.0297 - acc: 0.9913 - val_loss: 0.0311 - val_acc: 0.9886
Epoch 95/200
 - 1s - loss: 0.0292 - acc: 0.9913 - val_loss: 0.0311 - val_acc: 0.9886
Epoch 96/200
 - 1s - loss: 0.0305 - acc: 0.9913 - val_loss: 0.0311 - val_acc: 0.9886
Epoch 97/200
 - 1s - loss: 0.0296 - acc: 0.9910 - val_loss: 0.0311 - val_acc: 0.9886
Epoch 98/200
 - 1s - loss: 0.0294 - acc: 0.9922 - val_loss: 0.0311 - val_acc: 0.9886
Epoch 99/200
 - 1s - loss: 0.0302 - acc: 0.9909 - val_loss: 0.0311 - val_acc: 0.9886
Epoch 100/200
 - 1s - loss: 0.0297 - acc: 0.9917 - val_loss: 0.0310 - val_acc: 0.9886
Epoch 101/200
 - 1s - loss: 0.0294 - acc: 0.9917 - val_loss: 0.0310 - val_acc: 0.9886
Epoch 102/200
 - 1s - loss: 0.0287 - acc: 0.9924 - val_loss: 0.0310 - val_acc: 0.9886
Epoch 103/200
 - 1s - loss: 0.0294 - acc: 0.9918 - val_loss: 0.0310 - val_acc: 0.9886
Epoch 104/200
 - 1s - loss: 0.0301 - acc: 0.9917 - val_loss: 0.0310 - val_acc: 0.9886
Epoch 105/200
 - 1s - loss: 0.0295 - acc: 0.9917 - val_loss: 0.0310 - val_acc: 0.9886
Epoch 106/200
 - 1s - loss: 0.0292 - acc: 0.9911 - val_loss: 0.0310 - val_acc: 0.9886
Epoch 107/200
 - 1s - loss: 0.0291 - acc: 0.9911 - val_loss: 0.0310 - val_acc: 0.9886
Epoch 108/200
 - 1s - loss: 0.0298 - acc: 0.9917 - val_loss: 0.0310 - val_acc: 0.9886
Epoch 109/200
 - 1s - loss: 0.0290 - acc: 0.9919 - val_loss: 0.0309 - val_acc: 0.9886
Epoch 110/200
 - 1s - loss: 0.0294 - acc: 0.9916 - val_loss: 0.0309 - val_acc: 0.9886
Epoch 111/200
 - 1s - loss: 0.0295 - acc: 0.9913 - val_loss: 0.0309 - val_acc: 0.9886
Epoch 112/200
 - 1s - loss: 0.0291 - acc: 0.9920 - val_loss: 0.0309 - val_acc: 0.9886
Epoch 113/200
 - 1s - loss: 0.0291 - acc: 0.9917 - val_loss: 0.0309 - val_acc: 0.9886
Epoch 114/200
 - 1s - loss: 0.0293 - acc: 0.9911 - val_loss: 0.0309 - val_acc: 0.9886
Epoch 115/200
 - 1s - loss: 0.0289 - acc: 0.9914 - val_loss: 0.0309 - val_acc: 0.9888
Epoch 116/200
 - 1s - loss: 0.0287 - acc: 0.9920 - val_loss: 0.0309 - val_acc: 0.9888
Epoch 117/200
 - 1s - loss: 0.0293 - acc: 0.9922 - val_loss: 0.0309 - val_acc: 0.9888
Epoch 118/200
 - 1s - loss: 0.0288 - acc: 0.9921 - val_loss: 0.0309 - val_acc: 0.9888
Epoch 119/200
 - 1s - loss: 0.0298 - acc: 0.9911 - val_loss: 0.0308 - val_acc: 0.9888
Epoch 120/200
 - 1s - loss: 0.0293 - acc: 0.9912 - val_loss: 0.0308 - val_acc: 0.9888
Epoch 121/200
 - 1s - loss: 0.0292 - acc: 0.9912 - val_loss: 0.0308 - val_acc: 0.9888
Epoch 122/200
 - 1s - loss: 0.0292 - acc: 0.9920 - val_loss: 0.0308 - val_acc: 0.9888
Epoch 123/200
 - 1s - loss: 0.0289 - acc: 0.9922 - val_loss: 0.0308 - val_acc: 0.9888
Epoch 124/200
 - 1s - loss: 0.0292 - acc: 0.9914 - val_loss: 0.0308 - val_acc: 0.9888
Epoch 125/200
 - 1s - loss: 0.0295 - acc: 0.9919 - val_loss: 0.0308 - val_acc: 0.9888
Epoch 126/200
 - 1s - loss: 0.0293 - acc: 0.9917 - val_loss: 0.0308 - val_acc: 0.9888
Epoch 127/200
 - 1s - loss: 0.0290 - acc: 0.9916 - val_loss: 0.0308 - val_acc: 0.9888
Epoch 128/200
 - 1s - loss: 0.0288 - acc: 0.9916 - val_loss: 0.0308 - val_acc: 0.9888
Epoch 129/200
 - 1s - loss: 0.0294 - acc: 0.9918 - val_loss: 0.0308 - val_acc: 0.9888
Epoch 130/200
 - 1s - loss: 0.0287 - acc: 0.9916 - val_loss: 0.0307 - val_acc: 0.9888
Epoch 131/200
 - 1s - loss: 0.0294 - acc: 0.9914 - val_loss: 0.0307 - val_acc: 0.9888
Epoch 132/200
 - 1s - loss: 0.0292 - acc: 0.9918 - val_loss: 0.0307 - val_acc: 0.9888
Epoch 133/200
 - 1s - loss: 0.0287 - acc: 0.9925 - val_loss: 0.0307 - val_acc: 0.9888
Epoch 134/200
 - 1s - loss: 0.0295 - acc: 0.9915 - val_loss: 0.0307 - val_acc: 0.9890
Epoch 135/200
 - 1s - loss: 0.0294 - acc: 0.9913 - val_loss: 0.0307 - val_acc: 0.9888
Epoch 136/200
 - 1s - loss: 0.0290 - acc: 0.9918 - val_loss: 0.0307 - val_acc: 0.9890
Epoch 137/200
 - 1s - loss: 0.0291 - acc: 0.9914 - val_loss: 0.0307 - val_acc: 0.9890
Epoch 138/200
 - 1s - loss: 0.0291 - acc: 0.9918 - val_loss: 0.0307 - val_acc: 0.9890
Epoch 139/200
 - 1s - loss: 0.0292 - acc: 0.9917 - val_loss: 0.0307 - val_acc: 0.9890
Epoch 140/200
 - 1s - loss: 0.0292 - acc: 0.9917 - val_loss: 0.0307 - val_acc: 0.9890
Epoch 141/200
 - 1s - loss: 0.0290 - acc: 0.9917 - val_loss: 0.0307 - val_acc: 0.9890
Epoch 142/200
 - 1s - loss: 0.0291 - acc: 0.9914 - val_loss: 0.0306 - val_acc: 0.9890
Epoch 143/200
 - 1s - loss: 0.0284 - acc: 0.9919 - val_loss: 0.0306 - val_acc: 0.9890
Epoch 144/200
 - 1s - loss: 0.0289 - acc: 0.9920 - val_loss: 0.0306 - val_acc: 0.9890
Epoch 145/200
 - 1s - loss: 0.0293 - acc: 0.9913 - val_loss: 0.0306 - val_acc: 0.9890
Epoch 146/200
 - 1s - loss: 0.0289 - acc: 0.9915 - val_loss: 0.0306 - val_acc: 0.9890
Epoch 147/200
 - 1s - loss: 0.0293 - acc: 0.9916 - val_loss: 0.0306 - val_acc: 0.9890
Epoch 148/200
 - 1s - loss: 0.0294 - acc: 0.9918 - val_loss: 0.0306 - val_acc: 0.9890
Epoch 149/200
 - 1s - loss: 0.0290 - acc: 0.9920 - val_loss: 0.0306 - val_acc: 0.9890
Epoch 150/200
 - 1s - loss: 0.0290 - acc: 0.9920 - val_loss: 0.0306 - val_acc: 0.9890
Epoch 151/200
 - 1s - loss: 0.0285 - acc: 0.9919 - val_loss: 0.0306 - val_acc: 0.9890
Epoch 152/200
 - 1s - loss: 0.0286 - acc: 0.9918 - val_loss: 0.0306 - val_acc: 0.9890
Epoch 153/200
 - 1s - loss: 0.0291 - acc: 0.9918 - val_loss: 0.0306 - val_acc: 0.9892
Epoch 154/200
 - 1s - loss: 0.0292 - acc: 0.9917 - val_loss: 0.0306 - val_acc: 0.9890
Epoch 155/200
 - 1s - loss: 0.0281 - acc: 0.9920 - val_loss: 0.0306 - val_acc: 0.9890
Epoch 156/200
 - 1s - loss: 0.0297 - acc: 0.9909 - val_loss: 0.0306 - val_acc: 0.9892
Epoch 157/200
 - 1s - loss: 0.0291 - acc: 0.9914 - val_loss: 0.0305 - val_acc: 0.9892
Epoch 158/200
 - 1s - loss: 0.0289 - acc: 0.9918 - val_loss: 0.0305 - val_acc: 0.9892
Epoch 159/200
 - 1s - loss: 0.0284 - acc: 0.9920 - val_loss: 0.0305 - val_acc: 0.9892
Epoch 160/200
 - 1s - loss: 0.0292 - acc: 0.9910 - val_loss: 0.0305 - val_acc: 0.9892
Epoch 161/200
 - 1s - loss: 0.0295 - acc: 0.9923 - val_loss: 0.0305 - val_acc: 0.9892
Epoch 162/200
 - 1s - loss: 0.0297 - acc: 0.9913 - val_loss: 0.0305 - val_acc: 0.9892
Epoch 163/200
 - 1s - loss: 0.0289 - acc: 0.9911 - val_loss: 0.0305 - val_acc: 0.9892
Epoch 164/200
 - 1s - loss: 0.0281 - acc: 0.9924 - val_loss: 0.0305 - val_acc: 0.9892
Epoch 165/200
 - 1s - loss: 0.0286 - acc: 0.9918 - val_loss: 0.0305 - val_acc: 0.9892
Epoch 166/200
 - 1s - loss: 0.0286 - acc: 0.9920 - val_loss: 0.0305 - val_acc: 0.9892
Epoch 167/200
 - 1s - loss: 0.0287 - acc: 0.9916 - val_loss: 0.0305 - val_acc: 0.9892
Epoch 168/200
 - 1s - loss: 0.0282 - acc: 0.9922 - val_loss: 0.0305 - val_acc: 0.9892
Epoch 169/200
 - 1s - loss: 0.0297 - acc: 0.9910 - val_loss: 0.0305 - val_acc: 0.9892
Epoch 170/200
 - 1s - loss: 0.0287 - acc: 0.9915 - val_loss: 0.0305 - val_acc: 0.9892
Epoch 171/200
 - 1s - loss: 0.0285 - acc: 0.9924 - val_loss: 0.0305 - val_acc: 0.9892
Epoch 172/200
 - 1s - loss: 0.0283 - acc: 0.9914 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 173/200
 - 1s - loss: 0.0285 - acc: 0.9919 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 174/200
 - 1s - loss: 0.0281 - acc: 0.9922 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 175/200
 - 1s - loss: 0.0288 - acc: 0.9918 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 176/200
 - 1s - loss: 0.0291 - acc: 0.9919 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 177/200
 - 1s - loss: 0.0295 - acc: 0.9919 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 178/200
 - 1s - loss: 0.0287 - acc: 0.9917 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 179/200
 - 1s - loss: 0.0283 - acc: 0.9922 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 180/200
 - 1s - loss: 0.0289 - acc: 0.9918 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 181/200
 - 1s - loss: 0.0286 - acc: 0.9915 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 182/200
 - 1s - loss: 0.0287 - acc: 0.9922 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 183/200
 - 1s - loss: 0.0288 - acc: 0.9910 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 184/200
 - 1s - loss: 0.0281 - acc: 0.9923 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 185/200
 - 1s - loss: 0.0280 - acc: 0.9922 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 186/200
 - 1s - loss: 0.0289 - acc: 0.9913 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 187/200
 - 1s - loss: 0.0294 - acc: 0.9917 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 188/200
 - 1s - loss: 0.0289 - acc: 0.9911 - val_loss: 0.0304 - val_acc: 0.9892
Epoch 189/200
 - 1s - loss: 0.0289 - acc: 0.9919 - val_loss: 0.0303 - val_acc: 0.9892
Epoch 190/200
 - 1s - loss: 0.0283 - acc: 0.9922 - val_loss: 0.0303 - val_acc: 0.9892
Epoch 191/200
 - 1s - loss: 0.0287 - acc: 0.9918 - val_loss: 0.0303 - val_acc: 0.9892
Epoch 192/200
 - 1s - loss: 0.0281 - acc: 0.9920 - val_loss: 0.0303 - val_acc: 0.9892
Epoch 193/200
 - 1s - loss: 0.0284 - acc: 0.9917 - val_loss: 0.0303 - val_acc: 0.9892
Epoch 194/200
 - 1s - loss: 0.0285 - acc: 0.9915 - val_loss: 0.0303 - val_acc: 0.9892
Epoch 195/200
 - 1s - loss: 0.0295 - acc: 0.9914 - val_loss: 0.0303 - val_acc: 0.9892
Epoch 196/200
 - 1s - loss: 0.0288 - acc: 0.9916 - val_loss: 0.0303 - val_acc: 0.9890
Epoch 197/200
 - 1s - loss: 0.0284 - acc: 0.9914 - val_loss: 0.0303 - val_acc: 0.9890
Epoch 198/200
 - 1s - loss: 0.0288 - acc: 0.9918 - val_loss: 0.0303 - val_acc: 0.9890
Epoch 199/200
 - 1s - loss: 0.0284 - acc: 0.9917 - val_loss: 0.0303 - val_acc: 0.9890
Epoch 200/200
 - 1s - loss: 0.0284 - acc: 0.9915 - val_loss: 0.0303 - val_acc: 0.9890
2018-03-27 09:35:46,673 [INFO] Evaluate...
2018-03-27 09:35:48,707 [INFO] Done!
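Each trial in the log above loads pre-extracted bottleneck features from `original_InceptionV3.h5`, `original_Xception.h5`, and `original_ResNet50.h5`, concatenates them, and fits a small sigmoid classifier for 200 epochs. The sketch below illustrates that merge-then-classify shape; it is a minimal stand-in, not the notebook's actual code: tiny random matrices replace the real feature files, and a hand-rolled numpy logistic regression replaces the Keras dense head, so it runs without the `.h5` files.

```python
import numpy as np

rng = np.random.RandomState(0)
# Stand-ins for the three pre-extracted bottleneck feature sets
# (original_InceptionV3.h5, original_Xception.h5, original_ResNet50.h5);
# tiny random matrices here so the sketch runs without the real files.
feats = [rng.rand(200, 8) for _ in range(3)]
X = np.concatenate(feats, axis=1)            # one merged feature vector per image
scores = X @ rng.randn(X.shape[1])
y = (scores > np.median(scores)).astype(float)  # synthetic binary labels

# Logistic-regression head standing in for the small Keras classifier:
# sigmoid output trained with binary cross-entropy, as in the logs.
w = np.zeros(X.shape[1])
b = 0.0
lr = 0.5
for epoch in range(200):                     # mirrors the 200-epoch fits above
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid predictions
    grad = p - y                             # d(BCE)/d(logit)
    w -= lr * X.T @ grad / len(y)
    b -= lr * grad.mean()

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
acc = ((p > 0.5) == y).mean()
print("train acc:", acc)
```

Because the synthetic labels are linearly separable by construction, the stand-in reaches high training accuracy; the real notebook instead reports the Keras `val_loss`/`val_acc` shown in the log.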
2018-03-27 09:35:48,714 [INFO] tpe_transform took 0.003213 seconds
2018-03-27 09:35:48,715 [INFO] TPE using 16/16 trials with best loss 0.018868
2018-03-27 09:35:48,718 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 09:35:49,704 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 4s - loss: 0.0933 - acc: 0.9739 - val_loss: 0.0494 - val_acc: 0.9908
Epoch 2/200
 - 1s - loss: 0.0523 - acc: 0.9879 - val_loss: 0.0439 - val_acc: 0.9914
Epoch 3/200
 - 1s - loss: 0.0485 - acc: 0.9884 - val_loss: 0.0415 - val_acc: 0.9922
... (epochs 4-199 omitted: loss falls gradually from 0.0462 to 0.0329 and val_loss from 0.0399 to 0.0292, val_acc settling around 0.9932-0.9934) ...
Epoch 200/200
 - 1s - loss: 0.0330 - acc: 0.9914 - val_loss: 0.0292 - val_acc: 0.9932
2018-03-27 09:38:51,993 [INFO] Evaluate...
2018-03-27 09:38:54,046 [INFO] Done!
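The `tpe_transform` / `TPE using N/N trials with best loss ...` lines come from hyperopt's TPE optimizer driving an outer hyperparameter-search loop: each trial picks hyperparameters, trains the classifier, and reports a validation loss, and the optimizer tracks the best loss seen so far (0.018868 above). A dependency-free sketch of that outer loop, with plain random search standing in for TPE and a toy objective standing in for "train 200 epochs, return val_loss" (both are illustrative assumptions, not the notebook's code):

```python
import random

random.seed(0)

def objective(params):
    # Toy objective standing in for "train the classifier, return val_loss";
    # minimized at lr=0.01, dropout=0.5 by construction.
    return (params["lr"] - 0.01) ** 2 + (params["dropout"] - 0.5) ** 2

best_loss = float("inf")
for trial in range(17):                      # the log shows 17 trials so far
    params = {"lr": random.uniform(1e-4, 0.1),
              "dropout": random.uniform(0.0, 0.9)}
    loss = objective(params)
    best_loss = min(best_loss, loss)         # optimizer keeps the best loss seen

print("best loss after 17 trials:", best_loss)
```

TPE differs from this random search in that it models the distribution of good vs. bad hyperparameters from past trials and samples the next candidate accordingly, but the trial/best-loss bookkeeping visible in the log is the same.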
2018-03-27 09:38:54,053 [INFO] tpe_transform took 0.002435 seconds
2018-03-27 09:38:54,053 [INFO] TPE using 17/17 trials with best loss 0.018868
2018-03-27 09:38:54,056 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 09:38:55,043 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 4s - loss: 0.0592 - acc: 0.9783 - val_loss: 0.0280 - val_acc: 0.9914
Epoch 2/200
 - 1s - loss: 0.0285 - acc: 0.9910 - val_loss: 0.0261 - val_acc: 0.9920
Epoch 3/200
 - 1s - loss: 0.0270 - acc: 0.9915 - val_loss: 0.0252 - val_acc: 0.9928
Epoch 4/200
 - 1s - loss: 0.0259 - acc: 0.9922 - val_loss: 0.0246 - val_acc: 0.9930
Epoch 5/200
 - 1s - loss: 0.0251 - acc: 0.9921 - val_loss: 0.0243 - val_acc: 0.9930
Epoch 6/200
 - 1s - loss: 0.0252 - acc: 0.9922 - val_loss: 0.0240 - val_acc: 0.9930
Epoch 7/200
 - 1s - loss: 0.0244 - acc: 0.9924 - val_loss: 0.0237 - val_acc: 0.9930
Epoch 8/200
 - 1s - loss: 0.0249 - acc: 0.9923 - val_loss: 0.0236 - val_acc: 0.9928
Epoch 9/200
 - 1s - loss: 0.0240 - acc: 0.9923 - val_loss: 0.0234 - val_acc: 0.9928
Epoch 10/200
 - 1s - loss: 0.0237 - acc: 0.9928 - val_loss: 0.0233 - val_acc: 0.9928
Epoch 11/200
 - 1s - loss: 0.0239 - acc: 0.9926 - val_loss: 0.0232 - val_acc: 0.9930
Epoch 12/200
 - 1s - loss: 0.0232 - acc: 0.9933 - val_loss: 0.0230 - val_acc: 0.9930
Epoch 13/200
 - 1s - loss: 0.0235 - acc: 0.9928 - val_loss: 0.0229 - val_acc: 0.9932
Epoch 14/200
 - 1s - loss: 0.0234 - acc: 0.9929 - val_loss: 0.0229 - val_acc: 0.9932
Epoch 15/200
 - 1s - loss: 0.0232 - acc: 0.9932 - val_loss: 0.0228 - val_acc: 0.9936
Epoch 16/200
 - 1s - loss: 0.0228 - acc: 0.9931 - val_loss: 0.0227 - val_acc: 0.9936
Epoch 17/200
 - 1s - loss: 0.0227 - acc: 0.9928 - val_loss: 0.0227 - val_acc: 0.9934
Epoch 18/200
 - 1s - loss: 0.0227 - acc: 0.9933 - val_loss: 0.0226 - val_acc: 0.9932
Epoch 19/200
 - 1s - loss: 0.0233 - acc: 0.9925 - val_loss: 0.0225 - val_acc: 0.9930
Epoch 20/200
 - 1s - loss: 0.0229 - acc: 0.9931 - val_loss: 0.0225 - val_acc: 0.9930
Epoch 21/200
 - 1s - loss: 0.0226 - acc: 0.9933 - val_loss: 0.0224 - val_acc: 0.9930
Epoch 22/200
 - 1s - loss: 0.0227 - acc: 0.9930 - val_loss: 0.0224 - val_acc: 0.9930
Epoch 23/200
 - 1s - loss: 0.0225 - acc: 0.9928 - val_loss: 0.0224 - val_acc: 0.9930
Epoch 24/200
 - 1s - loss: 0.0228 - acc: 0.9929 - val_loss: 0.0223 - val_acc: 0.9930
Epoch 25/200
 - 1s - loss: 0.0221 - acc: 0.9929 - val_loss: 0.0223 - val_acc: 0.9930
Epoch 26/200
 - 1s - loss: 0.0223 - acc: 0.9932 - val_loss: 0.0222 - val_acc: 0.9930
Epoch 27/200
 - 1s - loss: 0.0221 - acc: 0.9928 - val_loss: 0.0222 - val_acc: 0.9930
Epoch 28/200
 - 1s - loss: 0.0221 - acc: 0.9928 - val_loss: 0.0222 - val_acc: 0.9930
Epoch 29/200
 - 1s - loss: 0.0223 - acc: 0.9936 - val_loss: 0.0222 - val_acc: 0.9930
Epoch 30/200
 - 1s - loss: 0.0221 - acc: 0.9931 - val_loss: 0.0221 - val_acc: 0.9930
Epoch 31/200
 - 1s - loss: 0.0222 - acc: 0.9936 - val_loss: 0.0221 - val_acc: 0.9930
Epoch 32/200
 - 1s - loss: 0.0223 - acc: 0.9931 - val_loss: 0.0221 - val_acc: 0.9930
Epoch 33/200
 - 1s - loss: 0.0218 - acc: 0.9933 - val_loss: 0.0220 - val_acc: 0.9930
Epoch 34/200
 - 1s - loss: 0.0220 - acc: 0.9937 - val_loss: 0.0220 - val_acc: 0.9930
Epoch 35/200
 - 1s - loss: 0.0218 - acc: 0.9938 - val_loss: 0.0220 - val_acc: 0.9930
Epoch 36/200
 - 1s - loss: 0.0223 - acc: 0.9931 - val_loss: 0.0220 - val_acc: 0.9930
Epoch 37/200
 - 1s - loss: 0.0217 - acc: 0.9940 - val_loss: 0.0219 - val_acc: 0.9930
Epoch 38/200
 - 1s - loss: 0.0219 - acc: 0.9936 - val_loss: 0.0219 - val_acc: 0.9930
Epoch 39/200
 - 1s - loss: 0.0221 - acc: 0.9932 - val_loss: 0.0219 - val_acc: 0.9930
Epoch 40/200
 - 1s - loss: 0.0220 - acc: 0.9931 - val_loss: 0.0219 - val_acc: 0.9930
Epoch 41/200
 - 1s - loss: 0.0220 - acc: 0.9931 - val_loss: 0.0219 - val_acc: 0.9930
Epoch 42/200
 - 1s - loss: 0.0219 - acc: 0.9929 - val_loss: 0.0218 - val_acc: 0.9930
Epoch 43/200
 - 1s - loss: 0.0219 - acc: 0.9935 - val_loss: 0.0218 - val_acc: 0.9930
Epoch 44/200
 - 1s - loss: 0.0217 - acc: 0.9938 - val_loss: 0.0218 - val_acc: 0.9930
Epoch 45/200
 - 1s - loss: 0.0217 - acc: 0.9933 - val_loss: 0.0218 - val_acc: 0.9930
Epoch 46/200
 - 1s - loss: 0.0219 - acc: 0.9933 - val_loss: 0.0218 - val_acc: 0.9930
Epoch 47/200
 - 1s - loss: 0.0218 - acc: 0.9933 - val_loss: 0.0217 - val_acc: 0.9930
Epoch 48/200
 - 1s - loss: 0.0216 - acc: 0.9937 - val_loss: 0.0217 - val_acc: 0.9930
Epoch 49/200
 - 1s - loss: 0.0214 - acc: 0.9933 - val_loss: 0.0217 - val_acc: 0.9930
Epoch 50/200
 - 1s - loss: 0.0213 - acc: 0.9937 - val_loss: 0.0217 - val_acc: 0.9930
Epoch 51/200
 - 1s - loss: 0.0219 - acc: 0.9933 - val_loss: 0.0217 - val_acc: 0.9930
Epoch 52/200
 - 1s - loss: 0.0212 - acc: 0.9932 - val_loss: 0.0217 - val_acc: 0.9930
Epoch 53/200
 - 1s - loss: 0.0218 - acc: 0.9936 - val_loss: 0.0217 - val_acc: 0.9930
Epoch 54/200
 - 1s - loss: 0.0222 - acc: 0.9929 - val_loss: 0.0216 - val_acc: 0.9930
Epoch 55/200
 - 1s - loss: 0.0218 - acc: 0.9933 - val_loss: 0.0216 - val_acc: 0.9930
Epoch 56/200
 - 1s - loss: 0.0213 - acc: 0.9936 - val_loss: 0.0216 - val_acc: 0.9930
Epoch 57/200
 - 1s - loss: 0.0215 - acc: 0.9936 - val_loss: 0.0216 - val_acc: 0.9930
Epoch 58/200
 - 1s - loss: 0.0214 - acc: 0.9936 - val_loss: 0.0216 - val_acc: 0.9930
Epoch 59/200
 - 1s - loss: 0.0215 - acc: 0.9936 - val_loss: 0.0216 - val_acc: 0.9932
Epoch 60/200
 - 1s - loss: 0.0214 - acc: 0.9937 - val_loss: 0.0216 - val_acc: 0.9932
Epoch 61/200
 - 1s - loss: 0.0215 - acc: 0.9932 - val_loss: 0.0216 - val_acc: 0.9932
Epoch 62/200
 - 1s - loss: 0.0209 - acc: 0.9937 - val_loss: 0.0215 - val_acc: 0.9932
Epoch 63/200
 - 1s - loss: 0.0216 - acc: 0.9932 - val_loss: 0.0215 - val_acc: 0.9932
Epoch 64/200
 - 1s - loss: 0.0216 - acc: 0.9935 - val_loss: 0.0215 - val_acc: 0.9932
Epoch 65/200
 - 1s - loss: 0.0211 - acc: 0.9936 - val_loss: 0.0215 - val_acc: 0.9932
Epoch 66/200
 - 1s - loss: 0.0215 - acc: 0.9937 - val_loss: 0.0215 - val_acc: 0.9932
Epoch 67/200
 - 1s - loss: 0.0216 - acc: 0.9933 - val_loss: 0.0215 - val_acc: 0.9932
Epoch 68/200
 - 1s - loss: 0.0213 - acc: 0.9932 - val_loss: 0.0215 - val_acc: 0.9932
Epoch 69/200
 - 1s - loss: 0.0214 - acc: 0.9938 - val_loss: 0.0215 - val_acc: 0.9932
Epoch 70/200
 - 1s - loss: 0.0211 - acc: 0.9936 - val_loss: 0.0215 - val_acc: 0.9932
Epoch 71/200
 - 1s - loss: 0.0208 - acc: 0.9939 - val_loss: 0.0214 - val_acc: 0.9932
Epoch 72/200
 - 1s - loss: 0.0217 - acc: 0.9930 - val_loss: 0.0214 - val_acc: 0.9932
Epoch 73/200
 - 1s - loss: 0.0220 - acc: 0.9929 - val_loss: 0.0214 - val_acc: 0.9932
Epoch 74/200
 - 1s - loss: 0.0211 - acc: 0.9936 - val_loss: 0.0214 - val_acc: 0.9932
Epoch 75/200
 - 1s - loss: 0.0211 - acc: 0.9937 - val_loss: 0.0214 - val_acc: 0.9932
Epoch 76/200
 - 1s - loss: 0.0208 - acc: 0.9938 - val_loss: 0.0214 - val_acc: 0.9932
Epoch 77/200
 - 1s - loss: 0.0218 - acc: 0.9930 - val_loss: 0.0214 - val_acc: 0.9932
Epoch 78/200
 - 1s - loss: 0.0215 - acc: 0.9936 - val_loss: 0.0214 - val_acc: 0.9932
...
Epoch 100/200
 - 1s - loss: 0.0213 - acc: 0.9935 - val_loss: 0.0212 - val_acc: 0.9934
...
Epoch 150/200
 - 1s - loss: 0.0207 - acc: 0.9936 - val_loss: 0.0209 - val_acc: 0.9934
...
Epoch 199/200
 - 1s - loss: 0.0197 - acc: 0.9940 - val_loss: 0.0208 - val_acc: 0.9934
Epoch 200/200
 - 1s - loss: 0.0201 - acc: 0.9933 - val_loss: 0.0208 - val_acc: 0.9934
2018-03-27 09:41:57,259 [INFO] Evaluate...
2018-03-27 09:41:59,342 [INFO] Done!
2018-03-27 09:41:59,349 [INFO] tpe_transform took 0.002557 seconds
2018-03-27 09:41:59,349 [INFO] TPE using 18/18 trials with best loss 0.018868
2018-03-27 09:41:59,351 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 09:42:00,342 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 4s - loss: 0.1560 - acc: 0.9516 - val_loss: 0.0708 - val_acc: 0.9856
Epoch 2/200
 - 1s - loss: 0.0809 - acc: 0.9813 - val_loss: 0.0598 - val_acc: 0.9860
Epoch 3/200
 - 1s - loss: 0.0729 - acc: 0.9821 - val_loss: 0.0549 - val_acc: 0.9868
Epoch 4/200
 - 1s - loss: 0.0684 - acc: 0.9833 - val_loss: 0.0521 - val_acc: 0.9870
Epoch 5/200
 - 1s - loss: 0.0652 - acc: 0.9840 - val_loss: 0.0502 - val_acc: 0.9872
...
Epoch 50/200
 - 1s - loss: 0.0501 - acc: 0.9863 - val_loss: 0.0384 - val_acc: 0.9892
...
Epoch 100/200
 - 1s - loss: 0.0462 - acc: 0.9873 - val_loss: 0.0363 - val_acc: 0.9900
...
Epoch 150/200
 - 1s - loss: 0.0461 - acc: 0.9876 - val_loss: 0.0353 - val_acc: 0.9902
...
Epoch 199/200
 - 1s - loss: 0.0452 - acc: 0.9881 - val_loss: 0.0346 - val_acc: 0.9902
Epoch 200/200
 - 1s - loss: 0.0442 - acc: 0.9879 - val_loss: 0.0346 - val_acc: 0.9902
2018-03-27 09:45:02,666 [INFO] Evaluate...
2018-03-27 09:45:04,842 [INFO] Done!
2018-03-27 09:45:04,849 [INFO] tpe_transform took 0.003154 seconds
2018-03-27 09:45:04,849 [INFO] TPE using 19/19 trials with best loss 0.018868
2018-03-27 09:45:04,851 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 09:45:05,846 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 4s - loss: 0.0547 - acc: 0.9790 - val_loss: 0.0277 - val_acc: 0.9918
Epoch 2/200
 - 1s - loss: 0.0300 - acc: 0.9905 - val_loss: 0.0248 - val_acc: 0.9932
Epoch 3/200
 - 1s - loss: 0.0280 - acc: 0.9915 - val_loss: 0.0238 - val_acc: 0.9938
Epoch 4/200
 - 1s - loss: 0.0267 - acc: 0.9920 - val_loss: 0.0232 - val_acc: 0.9940
Epoch 5/200
 - 1s - loss: 0.0266 - acc: 0.9918 - val_loss: 0.0227 - val_acc: 0.9940
...
Epoch 50/200
 - 1s - loss: 0.0225 - acc: 0.9934 - val_loss: 0.0200 - val_acc: 0.9944
...
Epoch 86/200
 - 1s - loss: 0.0216 - acc: 0.9931 - val_loss: 0.0196 - val_acc: 0.9950
Epoch 87/200
 - 1s - loss: 0.0216 - acc: 0.9936 - val_loss: 0.0196 - val_acc: 0.9952
Epoch 88/200
 - 1s - loss: 0.0215 - acc: 0.9931 - val_loss: 0.0196 - val_acc: 0.9952
Epoch 89/200
 - 1s - loss: 0.0213 - acc: 0.9937 - val_loss: 0.0196 - val_acc: 0.9950
Epoch 90/200
 - 1s - loss: 0.0213 - acc: 0.9937 - val_loss: 0.0196 - val_acc: 0.9950
Epoch 91/200
 - 1s - loss: 0.0218 - acc: 0.9933 - val_loss: 0.0196 - val_acc: 0.9950
Epoch 92/200
 - 1s - loss: 0.0215 - acc: 0.9935 - val_loss: 0.0196 - val_acc: 0.9950
Epoch 93/200
 - 1s - loss: 0.0214 - acc: 0.9931 - val_loss: 0.0196 - val_acc: 0.9950
Epoch 94/200
 - 1s - loss: 0.0211 - acc: 0.9931 - val_loss: 0.0196 - val_acc: 0.9950
Epoch 95/200
 - 1s - loss: 0.0216 - acc: 0.9932 - val_loss: 0.0196 - val_acc: 0.9950
Epoch 96/200
 - 1s - loss: 0.0216 - acc: 0.9935 - val_loss: 0.0195 - val_acc: 0.9950
Epoch 97/200
 - 1s - loss: 0.0212 - acc: 0.9940 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 98/200
 - 1s - loss: 0.0205 - acc: 0.9939 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 99/200
 - 1s - loss: 0.0210 - acc: 0.9941 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 100/200
 - 1s - loss: 0.0209 - acc: 0.9937 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 101/200
 - 1s - loss: 0.0208 - acc: 0.9932 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 102/200
 - 1s - loss: 0.0219 - acc: 0.9928 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 103/200
 - 1s - loss: 0.0215 - acc: 0.9933 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 104/200
 - 1s - loss: 0.0213 - acc: 0.9933 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 105/200
 - 1s - loss: 0.0209 - acc: 0.9940 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 106/200
 - 1s - loss: 0.0213 - acc: 0.9933 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 107/200
 - 1s - loss: 0.0207 - acc: 0.9937 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 108/200
 - 1s - loss: 0.0210 - acc: 0.9928 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 109/200
 - 1s - loss: 0.0211 - acc: 0.9935 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 110/200
 - 1s - loss: 0.0217 - acc: 0.9932 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 111/200
 - 1s - loss: 0.0211 - acc: 0.9940 - val_loss: 0.0195 - val_acc: 0.9952
Epoch 112/200
 - 1s - loss: 0.0207 - acc: 0.9937 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 113/200
 - 1s - loss: 0.0215 - acc: 0.9941 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 114/200
 - 1s - loss: 0.0218 - acc: 0.9932 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 115/200
 - 1s - loss: 0.0214 - acc: 0.9935 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 116/200
 - 1s - loss: 0.0211 - acc: 0.9935 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 117/200
 - 1s - loss: 0.0210 - acc: 0.9937 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 118/200
 - 1s - loss: 0.0204 - acc: 0.9941 - val_loss: 0.0194 - val_acc: 0.9950
Epoch 119/200
 - 1s - loss: 0.0211 - acc: 0.9936 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 120/200
 - 1s - loss: 0.0216 - acc: 0.9929 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 121/200
 - 1s - loss: 0.0213 - acc: 0.9933 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 122/200
 - 1s - loss: 0.0217 - acc: 0.9932 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 123/200
 - 1s - loss: 0.0209 - acc: 0.9942 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 124/200
 - 1s - loss: 0.0212 - acc: 0.9934 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 125/200
 - 1s - loss: 0.0215 - acc: 0.9932 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 126/200
 - 1s - loss: 0.0214 - acc: 0.9931 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 127/200
 - 1s - loss: 0.0213 - acc: 0.9940 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 128/200
 - 1s - loss: 0.0216 - acc: 0.9931 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 129/200
 - 1s - loss: 0.0213 - acc: 0.9937 - val_loss: 0.0194 - val_acc: 0.9952
Epoch 130/200
 - 1s - loss: 0.0210 - acc: 0.9937 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 131/200
 - 1s - loss: 0.0205 - acc: 0.9940 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 132/200
 - 1s - loss: 0.0212 - acc: 0.9933 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 133/200
 - 1s - loss: 0.0207 - acc: 0.9937 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 134/200
 - 1s - loss: 0.0209 - acc: 0.9935 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 135/200
 - 1s - loss: 0.0209 - acc: 0.9933 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 136/200
 - 1s - loss: 0.0210 - acc: 0.9937 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 137/200
 - 1s - loss: 0.0212 - acc: 0.9931 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 138/200
 - 1s - loss: 0.0208 - acc: 0.9937 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 139/200
 - 1s - loss: 0.0215 - acc: 0.9931 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 140/200
 - 1s - loss: 0.0211 - acc: 0.9935 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 141/200
 - 1s - loss: 0.0212 - acc: 0.9940 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 142/200
 - 1s - loss: 0.0209 - acc: 0.9940 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 143/200
 - 1s - loss: 0.0204 - acc: 0.9938 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 144/200
 - 1s - loss: 0.0202 - acc: 0.9939 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 145/200
 - 1s - loss: 0.0213 - acc: 0.9932 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 146/200
 - 1s - loss: 0.0214 - acc: 0.9935 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 147/200
 - 1s - loss: 0.0220 - acc: 0.9931 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 148/200
 - 1s - loss: 0.0204 - acc: 0.9940 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 149/200
 - 1s - loss: 0.0208 - acc: 0.9937 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 150/200
 - 1s - loss: 0.0212 - acc: 0.9932 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 151/200
 - 1s - loss: 0.0203 - acc: 0.9937 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 152/200
 - 1s - loss: 0.0206 - acc: 0.9935 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 153/200
 - 1s - loss: 0.0215 - acc: 0.9931 - val_loss: 0.0193 - val_acc: 0.9952
Epoch 154/200
 - 1s - loss: 0.0212 - acc: 0.9931 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 155/200
 - 1s - loss: 0.0208 - acc: 0.9934 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 156/200
 - 1s - loss: 0.0216 - acc: 0.9931 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 157/200
 - 1s - loss: 0.0209 - acc: 0.9931 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 158/200
 - 1s - loss: 0.0208 - acc: 0.9938 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 159/200
 - 1s - loss: 0.0209 - acc: 0.9937 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 160/200
 - 1s - loss: 0.0214 - acc: 0.9933 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 161/200
 - 1s - loss: 0.0208 - acc: 0.9938 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 162/200
 - 1s - loss: 0.0211 - acc: 0.9930 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 163/200
 - 1s - loss: 0.0208 - acc: 0.9937 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 164/200
 - 1s - loss: 0.0214 - acc: 0.9935 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 165/200
 - 1s - loss: 0.0211 - acc: 0.9937 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 166/200
 - 1s - loss: 0.0217 - acc: 0.9932 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 167/200
 - 1s - loss: 0.0206 - acc: 0.9938 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 168/200
 - 1s - loss: 0.0201 - acc: 0.9935 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 169/200
 - 1s - loss: 0.0214 - acc: 0.9932 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 170/200
 - 1s - loss: 0.0212 - acc: 0.9934 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 171/200
 - 1s - loss: 0.0214 - acc: 0.9932 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 172/200
 - 1s - loss: 0.0213 - acc: 0.9935 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 173/200
 - 1s - loss: 0.0206 - acc: 0.9933 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 174/200
 - 1s - loss: 0.0213 - acc: 0.9935 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 175/200
 - 1s - loss: 0.0207 - acc: 0.9936 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 176/200
 - 1s - loss: 0.0210 - acc: 0.9933 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 177/200
 - 1s - loss: 0.0218 - acc: 0.9934 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 178/200
 - 1s - loss: 0.0213 - acc: 0.9931 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 179/200
 - 1s - loss: 0.0201 - acc: 0.9937 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 180/200
 - 1s - loss: 0.0212 - acc: 0.9933 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 181/200
 - 1s - loss: 0.0209 - acc: 0.9936 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 182/200
 - 1s - loss: 0.0210 - acc: 0.9933 - val_loss: 0.0192 - val_acc: 0.9952
Epoch 183/200
 - 1s - loss: 0.0207 - acc: 0.9935 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 184/200
 - 1s - loss: 0.0207 - acc: 0.9937 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 185/200
 - 1s - loss: 0.0199 - acc: 0.9939 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 186/200
 - 1s - loss: 0.0211 - acc: 0.9935 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 187/200
 - 1s - loss: 0.0215 - acc: 0.9933 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 188/200
 - 1s - loss: 0.0209 - acc: 0.9934 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 189/200
 - 1s - loss: 0.0214 - acc: 0.9932 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 190/200
 - 1s - loss: 0.0210 - acc: 0.9935 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 191/200
 - 1s - loss: 0.0210 - acc: 0.9933 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 192/200
 - 1s - loss: 0.0205 - acc: 0.9935 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 193/200
 - 1s - loss: 0.0210 - acc: 0.9932 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 194/200
 - 1s - loss: 0.0203 - acc: 0.9935 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 195/200
 - 1s - loss: 0.0206 - acc: 0.9937 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 196/200
 - 1s - loss: 0.0204 - acc: 0.9940 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 197/200
 - 1s - loss: 0.0215 - acc: 0.9930 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 198/200
 - 1s - loss: 0.0209 - acc: 0.9937 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 199/200
 - 1s - loss: 0.0205 - acc: 0.9936 - val_loss: 0.0191 - val_acc: 0.9952
Epoch 200/200
 - 1s - loss: 0.0209 - acc: 0.9931 - val_loss: 0.0191 - val_acc: 0.9952
2018-03-27 09:48:08,799 [INFO] Evaluate...
2018-03-27 09:48:10,962 [INFO] Done!
2018-03-27 09:48:10,968 [INFO] tpe_transform took 0.002474 seconds
2018-03-27 09:48:10,969 [INFO] TPE using 20/20 trials with best loss 0.018868
2018-03-27 09:48:10,976 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 09:48:11,967 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 4s - loss: 0.0744 - acc: 0.9781 - val_loss: 0.0365 - val_acc: 0.9902
Epoch 2/200
 - 1s - loss: 0.0379 - acc: 0.9896 - val_loss: 0.0321 - val_acc: 0.9908
Epoch 3/200
 - 1s - loss: 0.0339 - acc: 0.9896 - val_loss: 0.0300 - val_acc: 0.9912
Epoch 4/200
 - 1s - loss: 0.0324 - acc: 0.9910 - val_loss: 0.0289 - val_acc: 0.9914
Epoch 5/200
 - 1s - loss: 0.0315 - acc: 0.9905 - val_loss: 0.0280 - val_acc: 0.9912
[... epochs 6–198 elided: val_loss fell gradually from 0.0275 to 0.0226, val_acc from 0.9912 to 0.9930 ...]
Epoch 199/200
 - 1s - loss: 0.0229 - acc: 0.9929 - val_loss: 0.0225 - val_acc: 0.9930
Epoch 200/200
 - 1s - loss: 0.0226 - acc: 0.9935 - val_loss: 0.0225 - val_acc: 0.9930
2018-03-27 09:51:14,846 [INFO] Evaluate...
2018-03-27 09:51:17,124 [INFO] Done!
2018-03-27 09:51:17,130 [INFO] tpe_transform took 0.002474 seconds
2018-03-27 09:51:17,131 [INFO] TPE using 21/21 trials with best loss 0.018868
2018-03-27 09:51:17,138 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 09:51:18,138 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 4s - loss: 0.0738 - acc: 0.9789 - val_loss: 0.0389 - val_acc: 0.9918
Epoch 2/200
 - 1s - loss: 0.0391 - acc: 0.9895 - val_loss: 0.0337 - val_acc: 0.9932
Epoch 3/200
 - 1s - loss: 0.0349 - acc: 0.9909 - val_loss: 0.0310 - val_acc: 0.9928
Epoch 4/200
 - 1s - loss: 0.0328 - acc: 0.9909 - val_loss: 0.0297 - val_acc: 0.9934
Epoch 5/200
 - 1s - loss: 0.0323 - acc: 0.9913 - val_loss: 0.0288 - val_acc: 0.9934
[... epochs 6–49 elided: val_loss fell gradually from 0.0281 to 0.0235, val_acc from 0.9934 to 0.9946 ...]
Epoch 50/200
 - 1s - loss: 0.0250 - acc: 0.9927 - val_loss: 0.0234 - val_acc: 0.9946
Epoch 51/200
 - 1s - loss: 0.0250 - acc: 0.9922 - val_loss: 0.0234 - val_acc: 0.9946
Epoch 52/200
 - 1s - loss: 0.0254 - acc: 0.9926 - val_loss: 0.0234 - val_acc: 0.9946
Epoch 53/200
 - 1s - loss: 0.0238 - acc: 0.9930 - val_loss: 0.0234 - val_acc: 0.9946
Epoch 54/200
 - 1s - loss: 0.0254 - acc: 0.9922 - val_loss: 0.0233 - val_acc: 0.9946
Epoch 55/200
 - 1s - loss: 0.0246 - acc: 0.9929 - val_loss: 0.0233 - val_acc: 0.9946
Epoch 56/200
 - 1s - loss: 0.0246 - acc: 0.9928 - val_loss: 0.0233 - val_acc: 0.9946
Epoch 57/200
 - 1s - loss: 0.0252 - acc: 0.9922 - val_loss: 0.0233 - val_acc: 0.9946
Epoch 58/200
 - 1s - loss: 0.0250 - acc: 0.9924 - val_loss: 0.0232 - val_acc: 0.9946
Epoch 59/200
 - 1s - loss: 0.0251 - acc: 0.9925 - val_loss: 0.0232 - val_acc: 0.9946
Epoch 60/200
 - 1s - loss: 0.0249 - acc: 0.9924 - val_loss: 0.0232 - val_acc: 0.9946
Epoch 61/200
 - 1s - loss: 0.0249 - acc: 0.9932 - val_loss: 0.0232 - val_acc: 0.9946
Epoch 62/200
 - 1s - loss: 0.0246 - acc: 0.9927 - val_loss: 0.0232 - val_acc: 0.9946
Epoch 63/200
 - 1s - loss: 0.0242 - acc: 0.9930 - val_loss: 0.0231 - val_acc: 0.9946
Epoch 64/200
 - 1s - loss: 0.0244 - acc: 0.9929 - val_loss: 0.0231 - val_acc: 0.9946
Epoch 65/200
 - 1s - loss: 0.0251 - acc: 0.9923 - val_loss: 0.0231 - val_acc: 0.9946
Epoch 66/200
 - 1s - loss: 0.0252 - acc: 0.9926 - val_loss: 0.0231 - val_acc: 0.9946
Epoch 67/200
 - 1s - loss: 0.0254 - acc: 0.9918 - val_loss: 0.0231 - val_acc: 0.9946
Epoch 68/200
 - 1s - loss: 0.0245 - acc: 0.9926 - val_loss: 0.0230 - val_acc: 0.9946
Epoch 69/200
 - 1s - loss: 0.0249 - acc: 0.9922 - val_loss: 0.0230 - val_acc: 0.9946
Epoch 70/200
 - 1s - loss: 0.0243 - acc: 0.9929 - val_loss: 0.0230 - val_acc: 0.9946
Epoch 71/200
 - 1s - loss: 0.0245 - acc: 0.9928 - val_loss: 0.0230 - val_acc: 0.9946
Epoch 72/200
 - 1s - loss: 0.0251 - acc: 0.9921 - val_loss: 0.0230 - val_acc: 0.9946
Epoch 73/200
 - 1s - loss: 0.0236 - acc: 0.9930 - val_loss: 0.0230 - val_acc: 0.9946
Epoch 74/200
 - 1s - loss: 0.0242 - acc: 0.9929 - val_loss: 0.0229 - val_acc: 0.9946
Epoch 75/200
 - 1s - loss: 0.0242 - acc: 0.9929 - val_loss: 0.0229 - val_acc: 0.9946
Epoch 76/200
 - 1s - loss: 0.0245 - acc: 0.9931 - val_loss: 0.0229 - val_acc: 0.9946
Epoch 77/200
 - 1s - loss: 0.0237 - acc: 0.9925 - val_loss: 0.0229 - val_acc: 0.9946
Epoch 78/200
 - 1s - loss: 0.0245 - acc: 0.9921 - val_loss: 0.0229 - val_acc: 0.9946
Epoch 79/200
 - 1s - loss: 0.0242 - acc: 0.9929 - val_loss: 0.0229 - val_acc: 0.9946
Epoch 80/200
 - 1s - loss: 0.0243 - acc: 0.9929 - val_loss: 0.0228 - val_acc: 0.9946
Epoch 81/200
 - 1s - loss: 0.0238 - acc: 0.9928 - val_loss: 0.0228 - val_acc: 0.9946
Epoch 82/200
 - 1s - loss: 0.0245 - acc: 0.9926 - val_loss: 0.0228 - val_acc: 0.9946
Epoch 83/200
 - 1s - loss: 0.0242 - acc: 0.9929 - val_loss: 0.0228 - val_acc: 0.9946
Epoch 84/200
 - 1s - loss: 0.0239 - acc: 0.9931 - val_loss: 0.0228 - val_acc: 0.9946
Epoch 85/200
 - 1s - loss: 0.0240 - acc: 0.9924 - val_loss: 0.0228 - val_acc: 0.9946
Epoch 86/200
 - 1s - loss: 0.0245 - acc: 0.9928 - val_loss: 0.0228 - val_acc: 0.9946
Epoch 87/200
 - 1s - loss: 0.0240 - acc: 0.9932 - val_loss: 0.0228 - val_acc: 0.9946
Epoch 88/200
 - 1s - loss: 0.0236 - acc: 0.9934 - val_loss: 0.0227 - val_acc: 0.9946
Epoch 89/200
 - 1s - loss: 0.0244 - acc: 0.9928 - val_loss: 0.0227 - val_acc: 0.9946
Epoch 90/200
 - 1s - loss: 0.0238 - acc: 0.9931 - val_loss: 0.0227 - val_acc: 0.9946
Epoch 91/200
 - 1s - loss: 0.0240 - acc: 0.9931 - val_loss: 0.0227 - val_acc: 0.9946
Epoch 92/200
 - 1s - loss: 0.0246 - acc: 0.9925 - val_loss: 0.0227 - val_acc: 0.9946
Epoch 93/200
 - 1s - loss: 0.0244 - acc: 0.9927 - val_loss: 0.0227 - val_acc: 0.9946
Epoch 94/200
 - 1s - loss: 0.0242 - acc: 0.9926 - val_loss: 0.0227 - val_acc: 0.9946
Epoch 95/200
 - 1s - loss: 0.0239 - acc: 0.9924 - val_loss: 0.0226 - val_acc: 0.9946
Epoch 96/200
 - 1s - loss: 0.0243 - acc: 0.9923 - val_loss: 0.0226 - val_acc: 0.9946
Epoch 97/200
 - 1s - loss: 0.0241 - acc: 0.9930 - val_loss: 0.0226 - val_acc: 0.9946
Epoch 98/200
 - 1s - loss: 0.0238 - acc: 0.9925 - val_loss: 0.0226 - val_acc: 0.9946
Epoch 99/200
 - 1s - loss: 0.0242 - acc: 0.9925 - val_loss: 0.0226 - val_acc: 0.9946
Epoch 100/200
 - 1s - loss: 0.0234 - acc: 0.9929 - val_loss: 0.0226 - val_acc: 0.9946
Epoch 101/200
 - 1s - loss: 0.0240 - acc: 0.9927 - val_loss: 0.0226 - val_acc: 0.9946
Epoch 102/200
 - 1s - loss: 0.0237 - acc: 0.9931 - val_loss: 0.0226 - val_acc: 0.9946
Epoch 103/200
 - 1s - loss: 0.0233 - acc: 0.9933 - val_loss: 0.0226 - val_acc: 0.9946
Epoch 104/200
 - 1s - loss: 0.0242 - acc: 0.9927 - val_loss: 0.0225 - val_acc: 0.9946
Epoch 105/200
 - 1s - loss: 0.0242 - acc: 0.9923 - val_loss: 0.0225 - val_acc: 0.9946
Epoch 106/200
 - 1s - loss: 0.0245 - acc: 0.9933 - val_loss: 0.0225 - val_acc: 0.9946
Epoch 107/200
 - 1s - loss: 0.0226 - acc: 0.9935 - val_loss: 0.0225 - val_acc: 0.9946
Epoch 108/200
 - 1s - loss: 0.0243 - acc: 0.9926 - val_loss: 0.0225 - val_acc: 0.9946
Epoch 109/200
 - 1s - loss: 0.0238 - acc: 0.9927 - val_loss: 0.0225 - val_acc: 0.9946
Epoch 110/200
 - 1s - loss: 0.0236 - acc: 0.9931 - val_loss: 0.0225 - val_acc: 0.9946
Epoch 111/200
 - 1s - loss: 0.0243 - acc: 0.9923 - val_loss: 0.0225 - val_acc: 0.9946
Epoch 112/200
 - 1s - loss: 0.0234 - acc: 0.9928 - val_loss: 0.0225 - val_acc: 0.9948
Epoch 113/200
 - 1s - loss: 0.0237 - acc: 0.9926 - val_loss: 0.0225 - val_acc: 0.9948
Epoch 114/200
 - 1s - loss: 0.0236 - acc: 0.9932 - val_loss: 0.0224 - val_acc: 0.9948
Epoch 115/200
 - 1s - loss: 0.0237 - acc: 0.9926 - val_loss: 0.0224 - val_acc: 0.9948
Epoch 116/200
 - 1s - loss: 0.0237 - acc: 0.9923 - val_loss: 0.0224 - val_acc: 0.9948
Epoch 117/200
 - 1s - loss: 0.0245 - acc: 0.9928 - val_loss: 0.0224 - val_acc: 0.9948
Epoch 118/200
 - 1s - loss: 0.0235 - acc: 0.9931 - val_loss: 0.0224 - val_acc: 0.9948
Epoch 119/200
 - 1s - loss: 0.0237 - acc: 0.9928 - val_loss: 0.0224 - val_acc: 0.9948
Epoch 120/200
 - 1s - loss: 0.0235 - acc: 0.9935 - val_loss: 0.0224 - val_acc: 0.9948
Epoch 121/200
 - 1s - loss: 0.0232 - acc: 0.9926 - val_loss: 0.0224 - val_acc: 0.9948
Epoch 122/200
 - 1s - loss: 0.0233 - acc: 0.9926 - val_loss: 0.0224 - val_acc: 0.9948
Epoch 123/200
 - 1s - loss: 0.0240 - acc: 0.9926 - val_loss: 0.0224 - val_acc: 0.9948
Epoch 124/200
 - 1s - loss: 0.0237 - acc: 0.9934 - val_loss: 0.0224 - val_acc: 0.9948
Epoch 125/200
 - 1s - loss: 0.0233 - acc: 0.9931 - val_loss: 0.0224 - val_acc: 0.9948
Epoch 126/200
 - 1s - loss: 0.0238 - acc: 0.9928 - val_loss: 0.0223 - val_acc: 0.9948
Epoch 127/200
 - 1s - loss: 0.0236 - acc: 0.9935 - val_loss: 0.0223 - val_acc: 0.9948
Epoch 128/200
 - 1s - loss: 0.0234 - acc: 0.9933 - val_loss: 0.0223 - val_acc: 0.9948
Epoch 129/200
 - 1s - loss: 0.0235 - acc: 0.9932 - val_loss: 0.0223 - val_acc: 0.9948
Epoch 130/200
 - 1s - loss: 0.0235 - acc: 0.9931 - val_loss: 0.0223 - val_acc: 0.9948
Epoch 131/200
 - 1s - loss: 0.0238 - acc: 0.9927 - val_loss: 0.0223 - val_acc: 0.9948
Epoch 132/200
 - 1s - loss: 0.0237 - acc: 0.9926 - val_loss: 0.0223 - val_acc: 0.9948
Epoch 133/200
 - 1s - loss: 0.0243 - acc: 0.9924 - val_loss: 0.0223 - val_acc: 0.9948
Epoch 134/200
 - 1s - loss: 0.0232 - acc: 0.9928 - val_loss: 0.0223 - val_acc: 0.9948
Epoch 135/200
 - 1s - loss: 0.0235 - acc: 0.9929 - val_loss: 0.0223 - val_acc: 0.9948
Epoch 136/200
 - 1s - loss: 0.0239 - acc: 0.9927 - val_loss: 0.0223 - val_acc: 0.9948
Epoch 137/200
 - 1s - loss: 0.0231 - acc: 0.9929 - val_loss: 0.0223 - val_acc: 0.9948
Epoch 138/200
 - 1s - loss: 0.0235 - acc: 0.9928 - val_loss: 0.0223 - val_acc: 0.9948
Epoch 139/200
 - 1s - loss: 0.0233 - acc: 0.9931 - val_loss: 0.0222 - val_acc: 0.9948
Epoch 140/200
 - 1s - loss: 0.0233 - acc: 0.9924 - val_loss: 0.0222 - val_acc: 0.9948
Epoch 141/200
 - 1s - loss: 0.0242 - acc: 0.9927 - val_loss: 0.0222 - val_acc: 0.9948
Epoch 142/200
 - 1s - loss: 0.0239 - acc: 0.9926 - val_loss: 0.0222 - val_acc: 0.9948
Epoch 143/200
 - 1s - loss: 0.0233 - acc: 0.9927 - val_loss: 0.0222 - val_acc: 0.9948
Epoch 144/200
 - 1s - loss: 0.0235 - acc: 0.9929 - val_loss: 0.0222 - val_acc: 0.9948
Epoch 145/200
 - 1s - loss: 0.0235 - acc: 0.9926 - val_loss: 0.0222 - val_acc: 0.9948
Epoch 146/200
 - 1s - loss: 0.0234 - acc: 0.9930 - val_loss: 0.0222 - val_acc: 0.9948
Epoch 147/200
 - 1s - loss: 0.0233 - acc: 0.9927 - val_loss: 0.0222 - val_acc: 0.9948
Epoch 148/200
 - 1s - loss: 0.0239 - acc: 0.9924 - val_loss: 0.0222 - val_acc: 0.9948
Epoch 149/200
 - 1s - loss: 0.0237 - acc: 0.9924 - val_loss: 0.0222 - val_acc: 0.9948
Epoch 150/200
 - 1s - loss: 0.0232 - acc: 0.9929 - val_loss: 0.0222 - val_acc: 0.9948
Epoch 151/200
 - 1s - loss: 0.0235 - acc: 0.9929 - val_loss: 0.0222 - val_acc: 0.9948
Epoch 152/200
 - 1s - loss: 0.0231 - acc: 0.9933 - val_loss: 0.0222 - val_acc: 0.9948
Epoch 153/200
 - 1s - loss: 0.0236 - acc: 0.9928 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 154/200
 - 1s - loss: 0.0236 - acc: 0.9923 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 155/200
 - 1s - loss: 0.0235 - acc: 0.9932 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 156/200
 - 1s - loss: 0.0230 - acc: 0.9933 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 157/200
 - 1s - loss: 0.0238 - acc: 0.9924 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 158/200
 - 1s - loss: 0.0236 - acc: 0.9928 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 159/200
 - 1s - loss: 0.0231 - acc: 0.9927 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 160/200
 - 1s - loss: 0.0232 - acc: 0.9931 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 161/200
 - 1s - loss: 0.0231 - acc: 0.9931 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 162/200
 - 1s - loss: 0.0236 - acc: 0.9928 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 163/200
 - 1s - loss: 0.0235 - acc: 0.9925 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 164/200
 - 1s - loss: 0.0229 - acc: 0.9932 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 165/200
 - 1s - loss: 0.0229 - acc: 0.9929 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 166/200
 - 1s - loss: 0.0232 - acc: 0.9928 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 167/200
 - 1s - loss: 0.0233 - acc: 0.9928 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 168/200
 - 1s - loss: 0.0237 - acc: 0.9929 - val_loss: 0.0221 - val_acc: 0.9948
Epoch 169/200
 - 1s - loss: 0.0227 - acc: 0.9931 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 170/200
 - 1s - loss: 0.0230 - acc: 0.9933 - val_loss: 0.0220 - val_acc: 0.9948
Epoch 171/200
 - 1s - loss: 0.0229 - acc: 0.9931 - val_loss: 0.0220 - val_acc: 0.9948
Epoch 172/200
 - 1s - loss: 0.0233 - acc: 0.9927 - val_loss: 0.0220 - val_acc: 0.9948
Epoch 173/200
 - 1s - loss: 0.0237 - acc: 0.9928 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 174/200
 - 1s - loss: 0.0233 - acc: 0.9932 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 175/200
 - 1s - loss: 0.0234 - acc: 0.9929 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 176/200
 - 1s - loss: 0.0231 - acc: 0.9930 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 177/200
 - 1s - loss: 0.0234 - acc: 0.9930 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 178/200
 - 1s - loss: 0.0233 - acc: 0.9933 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 179/200
 - 1s - loss: 0.0234 - acc: 0.9923 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 180/200
 - 1s - loss: 0.0238 - acc: 0.9923 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 181/200
 - 1s - loss: 0.0231 - acc: 0.9935 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 182/200
 - 1s - loss: 0.0237 - acc: 0.9929 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 183/200
 - 1s - loss: 0.0229 - acc: 0.9931 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 184/200
 - 1s - loss: 0.0232 - acc: 0.9936 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 185/200
 - 1s - loss: 0.0229 - acc: 0.9929 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 186/200
 - 1s - loss: 0.0230 - acc: 0.9926 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 187/200
 - 1s - loss: 0.0229 - acc: 0.9933 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 188/200
 - 1s - loss: 0.0229 - acc: 0.9927 - val_loss: 0.0220 - val_acc: 0.9950
Epoch 189/200
 - 1s - loss: 0.0230 - acc: 0.9935 - val_loss: 0.0219 - val_acc: 0.9950
Epoch 190/200
 - 1s - loss: 0.0232 - acc: 0.9928 - val_loss: 0.0219 - val_acc: 0.9950
Epoch 191/200
 - 1s - loss: 0.0229 - acc: 0.9930 - val_loss: 0.0219 - val_acc: 0.9950
Epoch 192/200
 - 1s - loss: 0.0233 - acc: 0.9931 - val_loss: 0.0219 - val_acc: 0.9950
Epoch 193/200
 - 1s - loss: 0.0232 - acc: 0.9929 - val_loss: 0.0219 - val_acc: 0.9950
Epoch 194/200
 - 1s - loss: 0.0234 - acc: 0.9929 - val_loss: 0.0219 - val_acc: 0.9950
Epoch 195/200
 - 1s - loss: 0.0225 - acc: 0.9931 - val_loss: 0.0219 - val_acc: 0.9950
Epoch 196/200
 - 1s - loss: 0.0232 - acc: 0.9926 - val_loss: 0.0219 - val_acc: 0.9950
Epoch 197/200
 - 1s - loss: 0.0228 - acc: 0.9933 - val_loss: 0.0219 - val_acc: 0.9950
Epoch 198/200
 - 1s - loss: 0.0229 - acc: 0.9925 - val_loss: 0.0219 - val_acc: 0.9950
Epoch 199/200
 - 1s - loss: 0.0235 - acc: 0.9933 - val_loss: 0.0219 - val_acc: 0.9950
Epoch 200/200
 - 1s - loss: 0.0228 - acc: 0.9933 - val_loss: 0.0219 - val_acc: 0.9950
2018-03-27 09:54:22,451 [INFO] Evaluate...
2018-03-27 09:54:24,698 [INFO] Done!
2018-03-27 09:54:24,704 [INFO] tpe_transform took 0.002435 seconds
2018-03-27 09:54:24,705 [INFO] TPE using 22/22 trials with best loss 0.018868
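The "TPE using 22/22 trials with best loss 0.018868" line above comes from a Tree-structured Parzen Estimator (hyperopt-style) hyperparameter search; the search code itself is not shown in this chunk. One useful way to read the reported "best loss" is to remember that the objective is the competition metric, binary cross-entropy: a mean log loss L corresponds to a geometric-mean predicted probability of exp(-L) for the true class. A quick check:

```python
import math

# "best loss" reported by the TPE log line above (binary cross-entropy).
best_loss = 0.018868

# exp(-L) is the geometric mean of the probability assigned to the
# true class across all samples.
mean_true_prob = math.exp(-best_loss)
print(mean_true_prob)  # ~0.9813: the true class gets ~98.1% probability on average
```

So at this loss the ensemble head assigns, on (geometric) average, about 98.1% probability to the correct class.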
2018-03-27 09:54:24,712 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 09:54:25,705 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 4s - loss: 0.0669 - acc: 0.9775 - val_loss: 0.0283 - val_acc: 0.9924
Epoch 2/200
 - 1s - loss: 0.0297 - acc: 0.9910 - val_loss: 0.0247 - val_acc: 0.9932
Epoch 3/200
 - 1s - loss: 0.0248 - acc: 0.9924 - val_loss: 0.0223 - val_acc: 0.9934
Epoch 4/200
 - 1s - loss: 0.0237 - acc: 0.9933 - val_loss: 0.0214 - val_acc: 0.9936
Epoch 5/200
 - 1s - loss: 0.0226 - acc: 0.9932 - val_loss: 0.0208 - val_acc: 0.9938
 [... epochs 6-161 elided: loss 0.0213 -> ~0.0130, val_loss 0.0202 -> 0.0172, val_acc settles at 0.9952 ...]
Epoch 162/200
 - 1s - loss: 0.0130 - acc: 0.9958 - val_loss: 0.0172 - val_acc: 0.9952
Epoch 163/200
 - 1s - loss: 0.0129 - acc: 0.9959 - val_loss: 0.0172 - val_acc: 0.9952
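This run halts at epoch 163 of a scheduled 200, which is consistent with a patience-based early-stopping rule monitoring val_loss. The exact callback configuration is not shown in the log; the stopping rule can be sketched as follows (the `patience` and `min_delta` values here are assumptions for illustration, not the notebook's actual settings):

```python
def early_stop_epoch(val_losses, patience=20, min_delta=0.0):
    """Return the epoch (1-based) at which training would stop when
    val_loss has not improved by at least min_delta for `patience`
    consecutive epochs -- a minimal stand-in for Keras's EarlyStopping."""
    best = float("inf")
    wait = 0
    for epoch, v in enumerate(val_losses, start=1):
        if v < best - min_delta:
            best, wait = v, 0       # new best: reset the patience counter
        else:
            wait += 1
            if wait >= patience:
                return epoch        # patience exhausted: stop here
    return len(val_losses)          # ran all epochs without stopping

# Toy trace: best val_loss at epoch 2, then 3 epochs without improvement.
print(early_stop_epoch([0.50, 0.40, 0.41, 0.42, 0.43], patience=3))  # -> 5
```

With a flat val_loss tail like the one above (0.0172 for dozens of epochs), such a rule fires well before epoch 200, matching the early exit seen in the log.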
2018-03-27 09:56:56,241 [INFO] Evaluate...
2018-03-27 09:56:58,547 [INFO] Done!
2018-03-27 09:56:58,554 [INFO] tpe_transform took 0.002457 seconds
2018-03-27 09:56:58,554 [INFO] TPE using 23/23 trials with best loss 0.017177
2018-03-27 09:56:58,562 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 09:56:59,546 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 4s - loss: 0.0876 - acc: 0.9723 - val_loss: 0.0320 - val_acc: 0.9916
Epoch 2/200
 - 1s - loss: 0.0352 - acc: 0.9899 - val_loss: 0.0251 - val_acc: 0.9932
Epoch 3/200
 - 1s - loss: 0.0318 - acc: 0.9895 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 4/200
 - 1s - loss: 0.0278 - acc: 0.9912 - val_loss: 0.0203 - val_acc: 0.9944
Epoch 5/200
 - 1s - loss: 0.0264 - acc: 0.9920 - val_loss: 0.0193 - val_acc: 0.9944
Epoch 6/200
 - 1s - loss: 0.0245 - acc: 0.9924 - val_loss: 0.0184 - val_acc: 0.9942
Epoch 7/200
 - 1s - loss: 0.0236 - acc: 0.9925 - val_loss: 0.0180 - val_acc: 0.9944
Epoch 8/200
 - 1s - loss: 0.0230 - acc: 0.9926 - val_loss: 0.0175 - val_acc: 0.9948
Epoch 9/200
 - 1s - loss: 0.0230 - acc: 0.9929 - val_loss: 0.0170 - val_acc: 0.9948
 [... epochs 10-48 elided: loss 0.0222 -> 0.0165, val_loss 0.0166 -> 0.0141, val_acc ~0.9950 ...]
Epoch 49/200
 - 1s - loss: 0.0158 - acc: 0.9950 - val_loss: 0.0142 - val_acc: 0.9954
Epoch 50/200
 - 1s - loss: 0.0155 - acc: 0.9953 - val_loss: 0.0140 - val_acc: 0.9952
Epoch 51/200
 - 1s - loss: 0.0161 - acc: 0.9948 - val_loss: 0.0140 - val_acc: 0.9952
Epoch 52/200
 - 1s - loss: 0.0153 - acc: 0.9953 - val_loss: 0.0140 - val_acc: 0.9952
Epoch 53/200
 - 1s - loss: 0.0171 - acc: 0.9946 - val_loss: 0.0140 - val_acc: 0.9952
Epoch 54/200
 - 1s - loss: 0.0158 - acc: 0.9950 - val_loss: 0.0139 - val_acc: 0.9952
Epoch 55/200
 - 1s - loss: 0.0153 - acc: 0.9946 - val_loss: 0.0139 - val_acc: 0.9952
Epoch 56/200
 - 1s - loss: 0.0161 - acc: 0.9949 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 57/200
 - 1s - loss: 0.0159 - acc: 0.9947 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 58/200
 - 1s - loss: 0.0164 - acc: 0.9948 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 59/200
 - 1s - loss: 0.0157 - acc: 0.9953 - val_loss: 0.0138 - val_acc: 0.9952
Epoch 60/200
 - 1s - loss: 0.0157 - acc: 0.9951 - val_loss: 0.0138 - val_acc: 0.9952
Epoch 61/200
 - 1s - loss: 0.0157 - acc: 0.9954 - val_loss: 0.0138 - val_acc: 0.9952
Epoch 62/200
 - 1s - loss: 0.0148 - acc: 0.9958 - val_loss: 0.0138 - val_acc: 0.9952
Epoch 63/200
 - 1s - loss: 0.0160 - acc: 0.9947 - val_loss: 0.0138 - val_acc: 0.9952
Epoch 64/200
 - 1s - loss: 0.0148 - acc: 0.9955 - val_loss: 0.0138 - val_acc: 0.9952
Epoch 65/200
 - 1s - loss: 0.0154 - acc: 0.9949 - val_loss: 0.0137 - val_acc: 0.9956
Epoch 66/200
 - 1s - loss: 0.0152 - acc: 0.9946 - val_loss: 0.0138 - val_acc: 0.9956
Epoch 67/200
 - 1s - loss: 0.0157 - acc: 0.9952 - val_loss: 0.0137 - val_acc: 0.9956
Epoch 68/200
 - 1s - loss: 0.0152 - acc: 0.9950 - val_loss: 0.0138 - val_acc: 0.9956
Epoch 69/200
 - 1s - loss: 0.0155 - acc: 0.9949 - val_loss: 0.0137 - val_acc: 0.9956
Epoch 70/200
 - 1s - loss: 0.0150 - acc: 0.9950 - val_loss: 0.0137 - val_acc: 0.9956
Epoch 71/200
 - 1s - loss: 0.0158 - acc: 0.9948 - val_loss: 0.0137 - val_acc: 0.9956
Epoch 72/200
 - 1s - loss: 0.0152 - acc: 0.9954 - val_loss: 0.0137 - val_acc: 0.9954
Epoch 73/200
 - 1s - loss: 0.0154 - acc: 0.9954 - val_loss: 0.0137 - val_acc: 0.9954
Epoch 74/200
 - 1s - loss: 0.0153 - acc: 0.9950 - val_loss: 0.0136 - val_acc: 0.9954
Epoch 75/200
 - 1s - loss: 0.0142 - acc: 0.9954 - val_loss: 0.0136 - val_acc: 0.9956
Epoch 76/200
 - 1s - loss: 0.0152 - acc: 0.9947 - val_loss: 0.0137 - val_acc: 0.9956
Epoch 77/200
 - 1s - loss: 0.0155 - acc: 0.9953 - val_loss: 0.0136 - val_acc: 0.9956
Epoch 78/200
 - 1s - loss: 0.0153 - acc: 0.9950 - val_loss: 0.0136 - val_acc: 0.9952
Epoch 79/200
 - 1s - loss: 0.0138 - acc: 0.9955 - val_loss: 0.0136 - val_acc: 0.9956
Epoch 80/200
 - 1s - loss: 0.0151 - acc: 0.9955 - val_loss: 0.0136 - val_acc: 0.9952
Epoch 81/200
 - 1s - loss: 0.0154 - acc: 0.9951 - val_loss: 0.0136 - val_acc: 0.9956
Epoch 82/200
 - 1s - loss: 0.0151 - acc: 0.9953 - val_loss: 0.0136 - val_acc: 0.9956
Epoch 83/200
 - 1s - loss: 0.0144 - acc: 0.9961 - val_loss: 0.0136 - val_acc: 0.9956
Epoch 84/200
 - 1s - loss: 0.0138 - acc: 0.9958 - val_loss: 0.0136 - val_acc: 0.9956
Epoch 85/200
 - 1s - loss: 0.0147 - acc: 0.9953 - val_loss: 0.0136 - val_acc: 0.9956
Epoch 86/200
 - 1s - loss: 0.0144 - acc: 0.9960 - val_loss: 0.0136 - val_acc: 0.9956
Epoch 87/200
 - 1s - loss: 0.0152 - acc: 0.9949 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 88/200
 - 1s - loss: 0.0148 - acc: 0.9953 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 89/200
 - 1s - loss: 0.0147 - acc: 0.9950 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 90/200
 - 1s - loss: 0.0145 - acc: 0.9959 - val_loss: 0.0136 - val_acc: 0.9956
Epoch 91/200
 - 1s - loss: 0.0152 - acc: 0.9956 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 92/200
 - 1s - loss: 0.0152 - acc: 0.9950 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 93/200
 - 1s - loss: 0.0144 - acc: 0.9954 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 94/200
 - 1s - loss: 0.0141 - acc: 0.9957 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 95/200
 - 1s - loss: 0.0144 - acc: 0.9949 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 96/200
 - 1s - loss: 0.0140 - acc: 0.9955 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 97/200
 - 1s - loss: 0.0147 - acc: 0.9954 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 98/200
 - 1s - loss: 0.0146 - acc: 0.9959 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 99/200
 - 1s - loss: 0.0154 - acc: 0.9947 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 100/200
 - 1s - loss: 0.0146 - acc: 0.9953 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 101/200
 - 1s - loss: 0.0146 - acc: 0.9954 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 102/200
 - 1s - loss: 0.0142 - acc: 0.9951 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 103/200
 - 1s - loss: 0.0143 - acc: 0.9952 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 104/200
 - 1s - loss: 0.0149 - acc: 0.9953 - val_loss: 0.0134 - val_acc: 0.9956
Epoch 105/200
 - 1s - loss: 0.0146 - acc: 0.9954 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 106/200
 - 1s - loss: 0.0143 - acc: 0.9954 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 107/200
 - 1s - loss: 0.0138 - acc: 0.9959 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 108/200
 - 1s - loss: 0.0150 - acc: 0.9951 - val_loss: 0.0135 - val_acc: 0.9956
Epoch 109/200
 - 1s - loss: 0.0146 - acc: 0.9952 - val_loss: 0.0134 - val_acc: 0.9956
Epoch 110/200
 - 1s - loss: 0.0149 - acc: 0.9951 - val_loss: 0.0134 - val_acc: 0.9956
Epoch 111/200
 - 1s - loss: 0.0142 - acc: 0.9953 - val_loss: 0.0134 - val_acc: 0.9956
Epoch 112/200
 - 1s - loss: 0.0138 - acc: 0.9961 - val_loss: 0.0134 - val_acc: 0.9956
Epoch 113/200
 - 1s - loss: 0.0142 - acc: 0.9954 - val_loss: 0.0134 - val_acc: 0.9956
Epoch 114/200
 - 1s - loss: 0.0144 - acc: 0.9954 - val_loss: 0.0134 - val_acc: 0.9954
Epoch 115/200
 - 1s - loss: 0.0142 - acc: 0.9956 - val_loss: 0.0134 - val_acc: 0.9954
Epoch 116/200
 - 1s - loss: 0.0142 - acc: 0.9958 - val_loss: 0.0134 - val_acc: 0.9956
Epoch 117/200
 - 1s - loss: 0.0144 - acc: 0.9955 - val_loss: 0.0134 - val_acc: 0.9956
Epoch 118/200
 - 1s - loss: 0.0139 - acc: 0.9956 - val_loss: 0.0134 - val_acc: 0.9956
Epoch 119/200
 - 1s - loss: 0.0145 - acc: 0.9954 - val_loss: 0.0134 - val_acc: 0.9956
Epoch 120/200
 - 1s - loss: 0.0140 - acc: 0.9958 - val_loss: 0.0134 - val_acc: 0.9956
Epoch 121/200
 - 1s - loss: 0.0148 - acc: 0.9953 - val_loss: 0.0134 - val_acc: 0.9956
Epoch 122/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0134 - val_acc: 0.9956
Epoch 123/200
 - 1s - loss: 0.0138 - acc: 0.9956 - val_loss: 0.0134 - val_acc: 0.9954
Epoch 124/200
 - 1s - loss: 0.0146 - acc: 0.9954 - val_loss: 0.0134 - val_acc: 0.9954
Epoch 125/200
 - 1s - loss: 0.0144 - acc: 0.9953 - val_loss: 0.0134 - val_acc: 0.9954
Epoch 126/200
 - 1s - loss: 0.0141 - acc: 0.9952 - val_loss: 0.0134 - val_acc: 0.9954
Epoch 127/200
 - 1s - loss: 0.0136 - acc: 0.9960 - val_loss: 0.0134 - val_acc: 0.9954
Epoch 128/200
 - 1s - loss: 0.0143 - acc: 0.9957 - val_loss: 0.0134 - val_acc: 0.9956
Epoch 129/200
 - 1s - loss: 0.0145 - acc: 0.9955 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 130/200
 - 1s - loss: 0.0144 - acc: 0.9955 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 131/200
 - 1s - loss: 0.0136 - acc: 0.9956 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 132/200
 - 1s - loss: 0.0141 - acc: 0.9954 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 133/200
 - 1s - loss: 0.0151 - acc: 0.9951 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 134/200
 - 1s - loss: 0.0137 - acc: 0.9957 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 135/200
 - 1s - loss: 0.0140 - acc: 0.9955 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 136/200
 - 1s - loss: 0.0134 - acc: 0.9959 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 137/200
 - 1s - loss: 0.0143 - acc: 0.9956 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 138/200
 - 1s - loss: 0.0141 - acc: 0.9954 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 139/200
 - 1s - loss: 0.0136 - acc: 0.9955 - val_loss: 0.0133 - val_acc: 0.9956
Epoch 140/200
 - 1s - loss: 0.0140 - acc: 0.9959 - val_loss: 0.0133 - val_acc: 0.9956
Epoch 141/200
 - 1s - loss: 0.0140 - acc: 0.9957 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 142/200
 - 1s - loss: 0.0142 - acc: 0.9955 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 143/200
 - 1s - loss: 0.0140 - acc: 0.9958 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 144/200
 - 1s - loss: 0.0137 - acc: 0.9958 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 145/200
 - 1s - loss: 0.0143 - acc: 0.9950 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 146/200
 - 1s - loss: 0.0137 - acc: 0.9959 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 147/200
 - 1s - loss: 0.0136 - acc: 0.9955 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 148/200
 - 1s - loss: 0.0145 - acc: 0.9951 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 149/200
 - 1s - loss: 0.0133 - acc: 0.9955 - val_loss: 0.0133 - val_acc: 0.9956
Epoch 150/200
 - 1s - loss: 0.0137 - acc: 0.9958 - val_loss: 0.0133 - val_acc: 0.9956
Epoch 151/200
 - 1s - loss: 0.0145 - acc: 0.9953 - val_loss: 0.0133 - val_acc: 0.9956
Epoch 152/200
 - 1s - loss: 0.0135 - acc: 0.9961 - val_loss: 0.0133 - val_acc: 0.9956
Epoch 153/200
 - 1s - loss: 0.0137 - acc: 0.9961 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 154/200
 - 1s - loss: 0.0136 - acc: 0.9955 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 155/200
 - 1s - loss: 0.0140 - acc: 0.9959 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 156/200
 - 1s - loss: 0.0143 - acc: 0.9954 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 157/200
 - 1s - loss: 0.0140 - acc: 0.9959 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 158/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 159/200
 - 1s - loss: 0.0137 - acc: 0.9955 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 160/200
 - 1s - loss: 0.0140 - acc: 0.9960 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 161/200
 - 1s - loss: 0.0137 - acc: 0.9958 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 162/200
 - 1s - loss: 0.0144 - acc: 0.9956 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 163/200
 - 1s - loss: 0.0142 - acc: 0.9954 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 164/200
 - 1s - loss: 0.0140 - acc: 0.9955 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 165/200
 - 1s - loss: 0.0135 - acc: 0.9954 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 166/200
 - 1s - loss: 0.0138 - acc: 0.9959 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 167/200
 - 1s - loss: 0.0141 - acc: 0.9956 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 168/200
 - 1s - loss: 0.0130 - acc: 0.9960 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 169/200
 - 1s - loss: 0.0132 - acc: 0.9960 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 170/200
 - 1s - loss: 0.0138 - acc: 0.9954 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 171/200
 - 1s - loss: 0.0134 - acc: 0.9958 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 172/200
 - 1s - loss: 0.0142 - acc: 0.9953 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 173/200
 - 1s - loss: 0.0142 - acc: 0.9957 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 174/200
 - 1s - loss: 0.0136 - acc: 0.9959 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 175/200
 - 1s - loss: 0.0146 - acc: 0.9955 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 176/200
 - 1s - loss: 0.0131 - acc: 0.9959 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 177/200
 - 1s - loss: 0.0133 - acc: 0.9960 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 178/200
 - 1s - loss: 0.0138 - acc: 0.9953 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 179/200
 - 1s - loss: 0.0137 - acc: 0.9960 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 180/200
 - 1s - loss: 0.0138 - acc: 0.9957 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 181/200
 - 1s - loss: 0.0138 - acc: 0.9956 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 182/200
 - 1s - loss: 0.0137 - acc: 0.9955 - val_loss: 0.0132 - val_acc: 0.9952
Epoch 183/200
 - 1s - loss: 0.0129 - acc: 0.9960 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 184/200
 - 1s - loss: 0.0141 - acc: 0.9954 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 185/200
 - 1s - loss: 0.0133 - acc: 0.9956 - val_loss: 0.0132 - val_acc: 0.9952
Epoch 186/200
 - 1s - loss: 0.0136 - acc: 0.9955 - val_loss: 0.0132 - val_acc: 0.9952
Epoch 187/200
 - 1s - loss: 0.0134 - acc: 0.9956 - val_loss: 0.0132 - val_acc: 0.9952
Epoch 188/200
 - 1s - loss: 0.0138 - acc: 0.9957 - val_loss: 0.0132 - val_acc: 0.9952
Epoch 189/200
 - 1s - loss: 0.0135 - acc: 0.9960 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 190/200
 - 1s - loss: 0.0135 - acc: 0.9959 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 191/200
 - 1s - loss: 0.0135 - acc: 0.9955 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 192/200
 - 1s - loss: 0.0140 - acc: 0.9956 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 193/200
 - 1s - loss: 0.0129 - acc: 0.9960 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 194/200
 - 1s - loss: 0.0131 - acc: 0.9957 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 195/200
 - 1s - loss: 0.0134 - acc: 0.9960 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 196/200
 - 1s - loss: 0.0135 - acc: 0.9958 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 197/200
 - 1s - loss: 0.0132 - acc: 0.9956 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 198/200
 - 1s - loss: 0.0129 - acc: 0.9963 - val_loss: 0.0132 - val_acc: 0.9954
Epoch 199/200
 - 1s - loss: 0.0135 - acc: 0.9958 - val_loss: 0.0132 - val_acc: 0.9952
Epoch 200/200
 - 1s - loss: 0.0127 - acc: 0.9963 - val_loss: 0.0132 - val_acc: 0.9952
2018-03-27 10:00:03,135 [INFO] Evaluate...
2018-03-27 10:00:05,586 [INFO] Done!
2018-03-27 10:00:05,593 [INFO] tpe_transform took 0.002540 seconds
2018-03-27 10:00:05,593 [INFO] TPE using 24/24 trials with best loss 0.013169
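The `TPE using 24/24 trials with best loss 0.013169` lines indicate that a hyperopt-style search loop wraps each training run above: every trial samples hyperparameters, trains the classifier head, and reports its validation loss back to the optimizer. As a minimal, dependency-free sketch of that loop (substituting plain random search for TPE, with a stubbed objective function — the parameter names and ranges here are illustrative assumptions, not the notebook's actual code):

```python
import random

def objective(params):
    # Stand-in for "train the head with these hyperparameters and
    # return val_loss"; a simple bowl-shaped function keeps the
    # sketch runnable without Keras.
    lr, dropout = params["lr"], params["dropout"]
    return (lr - 1e-3) ** 2 + (dropout - 0.5) ** 2

random.seed(0)
best_loss, best_params = float("inf"), None
for trial in range(25):  # the log above reports 24-25 trials
    params = {
        "lr": 10 ** random.uniform(-4, -2),      # log-uniform learning rate
        "dropout": random.uniform(0.0, 0.8),     # uniform dropout rate
    }
    loss = objective(params)
    if loss < best_loss:
        best_loss, best_params = loss, params

print("best loss:", best_loss, "with params:", best_params)
```

TPE improves on this by modeling which regions of the search space have produced good losses and sampling new trials from those regions, but the trial/evaluate/keep-best structure is the same.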
2018-03-27 10:00:05,601 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:00:06,605 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 4s - loss: 0.1171 - acc: 0.9628 - val_loss: 0.0500 - val_acc: 0.9874
Epoch 2/200
 - 1s - loss: 0.0556 - acc: 0.9851 - val_loss: 0.0407 - val_acc: 0.9908
Epoch 3/200
 - 1s - loss: 0.0481 - acc: 0.9870 - val_loss: 0.0369 - val_acc: 0.9916
Epoch 4/200
 - 1s - loss: 0.0442 - acc: 0.9879 - val_loss: 0.0349 - val_acc: 0.9912
Epoch 5/200
 - 1s - loss: 0.0425 - acc: 0.9879 - val_loss: 0.0333 - val_acc: 0.9922
Epoch 6/200
 - 1s - loss: 0.0416 - acc: 0.9890 - val_loss: 0.0322 - val_acc: 0.9922
Epoch 7/200
 - 1s - loss: 0.0399 - acc: 0.9884 - val_loss: 0.0315 - val_acc: 0.9924
Epoch 8/200
 - 1s - loss: 0.0396 - acc: 0.9884 - val_loss: 0.0307 - val_acc: 0.9926
Epoch 9/200
 - 1s - loss: 0.0386 - acc: 0.9901 - val_loss: 0.0302 - val_acc: 0.9924
Epoch 10/200
 - 1s - loss: 0.0380 - acc: 0.9890 - val_loss: 0.0298 - val_acc: 0.9926
... (epochs 11-197 trimmed: val_loss decreases gradually from 0.0294 to 0.0226, val_acc holds near 0.994) ...
Epoch 198/200
 - 1s - loss: 0.0299 - acc: 0.9906 - val_loss: 0.0226 - val_acc: 0.9938
Epoch 199/200
 - 1s - loss: 0.0298 - acc: 0.9910 - val_loss: 0.0226 - val_acc: 0.9938
Epoch 200/200
 - 1s - loss: 0.0295 - acc: 0.9911 - val_loss: 0.0226 - val_acc: 0.9938
2018-03-27 10:03:10,820 [INFO] Evaluate...
2018-03-27 10:03:13,228 [INFO] Done!
2018-03-27 10:03:13,235 [INFO] tpe_transform took 0.002457 seconds
2018-03-27 10:03:13,235 [INFO] TPE using 25/25 trials with best loss 0.013169
2018-03-27 10:03:13,242 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:03:14,229 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 4s - loss: 0.0856 - acc: 0.9723 - val_loss: 0.0336 - val_acc: 0.9906
Epoch 2/200
 - 1s - loss: 0.0346 - acc: 0.9900 - val_loss: 0.0279 - val_acc: 0.9912
Epoch 3/200
 - 1s - loss: 0.0301 - acc: 0.9907 - val_loss: 0.0249 - val_acc: 0.9912
... [epochs 4-66 omitted; val_loss falls from 0.0234 to 0.0179, then plateaus]
Epoch 67/200
 - 1s - loss: 0.0148 - acc: 0.9957 - val_loss: 0.0178 - val_acc: 0.9932
Epoch 68/200
 - 1s - loss: 0.0141 - acc: 0.9956 - val_loss: 0.0179 - val_acc: 0.9934
2018-03-27 10:04:22,576 [INFO] Evaluate...
2018-03-27 10:04:24,992 [INFO] Done!
2018-03-27 10:04:24,999 [INFO] tpe_transform took 0.002493 seconds
2018-03-27 10:04:24,999 [INFO] TPE using 26/26 trials with best loss 0.013169
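The "load features from original_InceptionV3.h5 / original_Xception.h5 / original_ResNet50.h5" lines load precomputed bottleneck features from three CNNs and concatenate them into one wide feature vector per image. A self-contained sketch of that pattern — the dataset names (`"train"`, `"label"`) and shapes are assumptions for illustration, since the real files are written by `helper.py`:

```python
import numpy as np
import h5py

# Build three tiny stand-in feature files mirroring the logs' file names.
rng = np.random.default_rng(0)
names = ["original_InceptionV3.h5", "original_Xception.h5", "original_ResNet50.h5"]
for name in names:
    with h5py.File(name, "w") as f:
        f.create_dataset("train", data=rng.standard_normal((8, 4)).astype("float32"))
        f.create_dataset("label", data=np.arange(8) % 2)

# Load each file's features and concatenate along the feature axis.
feats, labels = [], None
for name in names:
    with h5py.File(name, "r") as f:
        feats.append(np.array(f["train"]))
        labels = np.array(f["label"])
X = np.concatenate(feats, axis=1)
print(X.shape)  # (8, 12): one row per image, three feature blocks side by side
```

Training a small dense head on `X` is cheap, which is why each trial above fits 200 epochs in a couple of minutes.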
2018-03-27 10:04:25,006 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:04:25,993 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 4s - loss: 0.0666 - acc: 0.9776 - val_loss: 0.0273 - val_acc: 0.9934
Epoch 2/200
 - 1s - loss: 0.0297 - acc: 0.9908 - val_loss: 0.0231 - val_acc: 0.9940
Epoch 3/200
 - 1s - loss: 0.0249 - acc: 0.9921 - val_loss: 0.0210 - val_acc: 0.9944
... [epochs 4-37 omitted; val_loss falls from 0.0202 to 0.0167]
Epoch 38/200
 - 1s - loss: 0.0130 - acc: 0.9957 - val_loss: 0.0168 - val_acc: 0.9954
Epoch 39/200
 - 1s - loss: 0.0126 - acc: 0.9958 - val_loss: 0.0166 - val_acc: 0.9956
2018-03-27 10:05:09,527 [INFO] Evaluate...
2018-03-27 10:05:12,019 [INFO] Done!
2018-03-27 10:05:12,025 [INFO] tpe_transform took 0.002556 seconds
2018-03-27 10:05:12,026 [INFO] TPE using 27/27 trials with best loss 0.013169
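Note that the trials stop after different epoch counts (200, 68, 39), which is consistent with an EarlyStopping-style patience rule on `val_loss` cutting runs short once the validation loss stops improving — an inference from the logs, not something the logs state. A minimal sketch of that patience logic:

```python
def early_stop_epochs(val_losses, patience=10, min_delta=0.0):
    """Return how many epochs a patience-based early-stopping rule would run."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best - min_delta:
            best, wait = loss, 0   # improvement: reset the patience counter
        else:
            wait += 1              # no improvement this epoch
            if wait >= patience:
                return epoch       # stop after `patience` stagnant epochs
    return len(val_losses)

# A curve that stops improving after epoch 5 runs 5 + patience epochs.
curve = [0.03, 0.025, 0.022, 0.021, 0.020] + [0.020] * 50
print(early_stop_epochs(curve, patience=10))  # 15
```

With a still-improving curve (like the 200-epoch trials above, where val_loss keeps inching down), the rule never fires and the run goes the full distance.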
2018-03-27 10:05:12,034 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:05:13,027 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 4s - loss: 0.1042 - acc: 0.9680 - val_loss: 0.0472 - val_acc: 0.9902
Epoch 2/200
 - 1s - loss: 0.0519 - acc: 0.9871 - val_loss: 0.0404 - val_acc: 0.9906
Epoch 3/200
 - 1s - loss: 0.0476 - acc: 0.9868 - val_loss: 0.0376 - val_acc: 0.9908
... [epochs 4-198 omitted; val_loss creeps from 0.0360 down to 0.0266, val_acc stuck at 0.9918]
Epoch 199/200
 - 1s - loss: 0.0291 - acc: 0.9919 - val_loss: 0.0265 - val_acc: 0.9918
Epoch 200/200
 - 1s - loss: 0.0304 - acc: 0.9904 - val_loss: 0.0265 - val_acc: 0.9918
2018-03-27 10:08:18,474 [INFO] Evaluate...
2018-03-27 10:08:20,991 [INFO] Done!
2018-03-27 10:08:20,997 [INFO] tpe_transform took 0.002491 seconds
2018-03-27 10:08:20,998 [INFO] TPE using 28/28 trials with best loss 0.013169
2018-03-27 10:08:21,006 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:08:21,991 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 4s - loss: 0.0821 - acc: 0.9778 - val_loss: 0.0379 - val_acc: 0.9910
Epoch 2/200
 - 1s - loss: 0.0406 - acc: 0.9896 - val_loss: 0.0320 - val_acc: 0.9914
Epoch 3/200
 - 1s - loss: 0.0351 - acc: 0.9897 - val_loss: 0.0296 - val_acc: 0.9922
... [epochs 4-62 omitted; val_loss falls from 0.0283 to 0.0225]
Epoch 63/200
 - 1s - loss: 0.0238 - acc: 0.9930 - val_loss: 0.0225 - val_acc: 0.9926
Epoch 64/200
 - 1s - loss: 0.0237 - acc: 0.9923 - val_loss: 0.0224 - val_acc: 0.9926
Epoch 65/200
 - 1s - loss: 0.0238 - acc: 0.9926 - val_loss: 0.0224 - val_acc: 0.9926
Epoch 66/200
 - 1s - loss: 0.0243 - acc: 0.9923 - val_loss: 0.0224 - val_acc: 0.9926
Epoch 67/200
 - 1s - loss: 0.0249 - acc: 0.9921 - val_loss: 0.0224 - val_acc: 0.9926
Epoch 68/200
 - 1s - loss: 0.0233 - acc: 0.9933 - val_loss: 0.0224 - val_acc: 0.9926
Epoch 69/200
 - 1s - loss: 0.0228 - acc: 0.9933 - val_loss: 0.0224 - val_acc: 0.9926
Epoch 70/200
 - 1s - loss: 0.0230 - acc: 0.9927 - val_loss: 0.0224 - val_acc: 0.9926
Epoch 71/200
 - 1s - loss: 0.0228 - acc: 0.9930 - val_loss: 0.0223 - val_acc: 0.9926
Epoch 72/200
 - 1s - loss: 0.0245 - acc: 0.9929 - val_loss: 0.0223 - val_acc: 0.9926
Epoch 73/200
 - 1s - loss: 0.0231 - acc: 0.9934 - val_loss: 0.0223 - val_acc: 0.9926
Epoch 74/200
 - 1s - loss: 0.0235 - acc: 0.9929 - val_loss: 0.0223 - val_acc: 0.9924
Epoch 75/200
 - 1s - loss: 0.0219 - acc: 0.9936 - val_loss: 0.0223 - val_acc: 0.9924
Epoch 76/200
 - 1s - loss: 0.0232 - acc: 0.9931 - val_loss: 0.0223 - val_acc: 0.9924
Epoch 77/200
 - 1s - loss: 0.0231 - acc: 0.9936 - val_loss: 0.0223 - val_acc: 0.9926
Epoch 78/200
 - 1s - loss: 0.0242 - acc: 0.9922 - val_loss: 0.0222 - val_acc: 0.9926
Epoch 79/200
 - 1s - loss: 0.0232 - acc: 0.9928 - val_loss: 0.0222 - val_acc: 0.9926
Epoch 80/200
 - 1s - loss: 0.0235 - acc: 0.9918 - val_loss: 0.0222 - val_acc: 0.9926
Epoch 81/200
 - 1s - loss: 0.0231 - acc: 0.9927 - val_loss: 0.0222 - val_acc: 0.9926
Epoch 82/200
 - 1s - loss: 0.0243 - acc: 0.9923 - val_loss: 0.0222 - val_acc: 0.9926
Epoch 83/200
 - 1s - loss: 0.0223 - acc: 0.9937 - val_loss: 0.0222 - val_acc: 0.9924
Epoch 84/200
 - 1s - loss: 0.0222 - acc: 0.9933 - val_loss: 0.0222 - val_acc: 0.9926
Epoch 85/200
 - 1s - loss: 0.0231 - acc: 0.9931 - val_loss: 0.0222 - val_acc: 0.9926
Epoch 86/200
 - 1s - loss: 0.0222 - acc: 0.9931 - val_loss: 0.0221 - val_acc: 0.9926
Epoch 87/200
 - 1s - loss: 0.0230 - acc: 0.9929 - val_loss: 0.0221 - val_acc: 0.9924
Epoch 88/200
 - 1s - loss: 0.0228 - acc: 0.9932 - val_loss: 0.0221 - val_acc: 0.9926
Epoch 89/200
 - 1s - loss: 0.0235 - acc: 0.9928 - val_loss: 0.0221 - val_acc: 0.9926
Epoch 90/200
 - 1s - loss: 0.0223 - acc: 0.9930 - val_loss: 0.0221 - val_acc: 0.9924
Epoch 91/200
 - 1s - loss: 0.0227 - acc: 0.9928 - val_loss: 0.0221 - val_acc: 0.9924
Epoch 92/200
 - 1s - loss: 0.0229 - acc: 0.9929 - val_loss: 0.0221 - val_acc: 0.9924
Epoch 93/200
 - 1s - loss: 0.0224 - acc: 0.9936 - val_loss: 0.0221 - val_acc: 0.9924
Epoch 94/200
 - 1s - loss: 0.0221 - acc: 0.9937 - val_loss: 0.0221 - val_acc: 0.9924
Epoch 95/200
 - 1s - loss: 0.0235 - acc: 0.9922 - val_loss: 0.0221 - val_acc: 0.9924
Epoch 96/200
 - 1s - loss: 0.0226 - acc: 0.9929 - val_loss: 0.0221 - val_acc: 0.9924
Epoch 97/200
 - 1s - loss: 0.0228 - acc: 0.9928 - val_loss: 0.0220 - val_acc: 0.9924
Epoch 98/200
 - 1s - loss: 0.0222 - acc: 0.9934 - val_loss: 0.0220 - val_acc: 0.9924
Epoch 99/200
 - 1s - loss: 0.0239 - acc: 0.9926 - val_loss: 0.0220 - val_acc: 0.9924
Epoch 100/200
 - 1s - loss: 0.0232 - acc: 0.9928 - val_loss: 0.0220 - val_acc: 0.9926
Epoch 101/200
 - 1s - loss: 0.0216 - acc: 0.9937 - val_loss: 0.0220 - val_acc: 0.9926
Epoch 102/200
 - 1s - loss: 0.0223 - acc: 0.9929 - val_loss: 0.0220 - val_acc: 0.9928
Epoch 103/200
 - 1s - loss: 0.0234 - acc: 0.9928 - val_loss: 0.0220 - val_acc: 0.9928
Epoch 104/200
 - 1s - loss: 0.0227 - acc: 0.9931 - val_loss: 0.0220 - val_acc: 0.9924
Epoch 105/200
 - 1s - loss: 0.0224 - acc: 0.9932 - val_loss: 0.0220 - val_acc: 0.9924
Epoch 106/200
 - 1s - loss: 0.0223 - acc: 0.9931 - val_loss: 0.0220 - val_acc: 0.9926
Epoch 107/200
 - 1s - loss: 0.0218 - acc: 0.9937 - val_loss: 0.0220 - val_acc: 0.9924
Epoch 108/200
 - 1s - loss: 0.0236 - acc: 0.9927 - val_loss: 0.0220 - val_acc: 0.9924
Epoch 109/200
 - 1s - loss: 0.0231 - acc: 0.9929 - val_loss: 0.0219 - val_acc: 0.9924
Epoch 110/200
 - 1s - loss: 0.0228 - acc: 0.9928 - val_loss: 0.0219 - val_acc: 0.9926
Epoch 111/200
 - 1s - loss: 0.0224 - acc: 0.9934 - val_loss: 0.0219 - val_acc: 0.9924
Epoch 112/200
 - 1s - loss: 0.0235 - acc: 0.9929 - val_loss: 0.0219 - val_acc: 0.9924
Epoch 113/200
 - 1s - loss: 0.0224 - acc: 0.9931 - val_loss: 0.0219 - val_acc: 0.9924
Epoch 114/200
 - 1s - loss: 0.0228 - acc: 0.9929 - val_loss: 0.0219 - val_acc: 0.9924
Epoch 115/200
 - 1s - loss: 0.0235 - acc: 0.9924 - val_loss: 0.0219 - val_acc: 0.9924
Epoch 116/200
 - 1s - loss: 0.0235 - acc: 0.9930 - val_loss: 0.0219 - val_acc: 0.9924
Epoch 117/200
 - 1s - loss: 0.0222 - acc: 0.9932 - val_loss: 0.0219 - val_acc: 0.9924
Epoch 118/200
 - 1s - loss: 0.0237 - acc: 0.9923 - val_loss: 0.0219 - val_acc: 0.9924
Epoch 119/200
 - 1s - loss: 0.0223 - acc: 0.9933 - val_loss: 0.0219 - val_acc: 0.9924
Epoch 120/200
 - 1s - loss: 0.0225 - acc: 0.9935 - val_loss: 0.0219 - val_acc: 0.9924
Epoch 121/200
 - 1s - loss: 0.0221 - acc: 0.9930 - val_loss: 0.0219 - val_acc: 0.9924
Epoch 122/200
 - 1s - loss: 0.0224 - acc: 0.9934 - val_loss: 0.0219 - val_acc: 0.9924
Epoch 123/200
 - 1s - loss: 0.0222 - acc: 0.9935 - val_loss: 0.0219 - val_acc: 0.9924
Epoch 124/200
 - 1s - loss: 0.0228 - acc: 0.9929 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 125/200
 - 1s - loss: 0.0229 - acc: 0.9924 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 126/200
 - 1s - loss: 0.0222 - acc: 0.9931 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 127/200
 - 1s - loss: 0.0221 - acc: 0.9938 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 128/200
 - 1s - loss: 0.0227 - acc: 0.9929 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 129/200
 - 1s - loss: 0.0230 - acc: 0.9931 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 130/200
 - 1s - loss: 0.0213 - acc: 0.9938 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 131/200
 - 1s - loss: 0.0226 - acc: 0.9929 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 132/200
 - 1s - loss: 0.0221 - acc: 0.9938 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 133/200
 - 1s - loss: 0.0223 - acc: 0.9929 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 134/200
 - 1s - loss: 0.0209 - acc: 0.9933 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 135/200
 - 1s - loss: 0.0227 - acc: 0.9931 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 136/200
 - 1s - loss: 0.0223 - acc: 0.9931 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 137/200
 - 1s - loss: 0.0233 - acc: 0.9922 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 138/200
 - 1s - loss: 0.0230 - acc: 0.9922 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 139/200
 - 1s - loss: 0.0222 - acc: 0.9931 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 140/200
 - 1s - loss: 0.0219 - acc: 0.9931 - val_loss: 0.0218 - val_acc: 0.9924
Epoch 141/200
 - 1s - loss: 0.0234 - acc: 0.9924 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 142/200
 - 1s - loss: 0.0208 - acc: 0.9940 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 143/200
 - 1s - loss: 0.0221 - acc: 0.9930 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 144/200
 - 1s - loss: 0.0225 - acc: 0.9935 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 145/200
 - 1s - loss: 0.0231 - acc: 0.9924 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 146/200
 - 1s - loss: 0.0232 - acc: 0.9929 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 147/200
 - 1s - loss: 0.0228 - acc: 0.9933 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 148/200
 - 1s - loss: 0.0225 - acc: 0.9931 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 149/200
 - 1s - loss: 0.0226 - acc: 0.9929 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 150/200
 - 1s - loss: 0.0225 - acc: 0.9929 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 151/200
 - 1s - loss: 0.0223 - acc: 0.9934 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 152/200
 - 1s - loss: 0.0233 - acc: 0.9929 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 153/200
 - 1s - loss: 0.0229 - acc: 0.9934 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 154/200
 - 1s - loss: 0.0227 - acc: 0.9929 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 155/200
 - 1s - loss: 0.0219 - acc: 0.9931 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 156/200
 - 1s - loss: 0.0227 - acc: 0.9931 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 157/200
 - 1s - loss: 0.0217 - acc: 0.9927 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 158/200
 - 1s - loss: 0.0228 - acc: 0.9924 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 159/200
 - 1s - loss: 0.0219 - acc: 0.9933 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 160/200
 - 1s - loss: 0.0223 - acc: 0.9928 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 161/200
 - 1s - loss: 0.0230 - acc: 0.9924 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 162/200
 - 1s - loss: 0.0216 - acc: 0.9936 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 163/200
 - 1s - loss: 0.0218 - acc: 0.9935 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 164/200
 - 1s - loss: 0.0223 - acc: 0.9931 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 165/200
 - 1s - loss: 0.0214 - acc: 0.9938 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 166/200
 - 1s - loss: 0.0223 - acc: 0.9935 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 167/200
 - 1s - loss: 0.0228 - acc: 0.9931 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 168/200
 - 1s - loss: 0.0232 - acc: 0.9928 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 169/200
 - 1s - loss: 0.0221 - acc: 0.9929 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 170/200
 - 1s - loss: 0.0222 - acc: 0.9931 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 171/200
 - 1s - loss: 0.0210 - acc: 0.9940 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 172/200
 - 1s - loss: 0.0225 - acc: 0.9933 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 173/200
 - 1s - loss: 0.0223 - acc: 0.9933 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 174/200
 - 1s - loss: 0.0221 - acc: 0.9937 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 175/200
 - 1s - loss: 0.0215 - acc: 0.9933 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 176/200
 - 1s - loss: 0.0228 - acc: 0.9932 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 177/200
 - 1s - loss: 0.0221 - acc: 0.9929 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 178/200
 - 1s - loss: 0.0232 - acc: 0.9924 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 179/200
 - 1s - loss: 0.0222 - acc: 0.9928 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 180/200
 - 1s - loss: 0.0230 - acc: 0.9928 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 181/200
 - 1s - loss: 0.0226 - acc: 0.9931 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 182/200
 - 1s - loss: 0.0228 - acc: 0.9927 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 183/200
 - 1s - loss: 0.0217 - acc: 0.9931 - val_loss: 0.0216 - val_acc: 0.9924
Epoch 184/200
 - 1s - loss: 0.0217 - acc: 0.9931 - val_loss: 0.0215 - val_acc: 0.9924
Epoch 185/200
 - 1s - loss: 0.0223 - acc: 0.9933 - val_loss: 0.0215 - val_acc: 0.9924
Epoch 186/200
 - 1s - loss: 0.0223 - acc: 0.9932 - val_loss: 0.0215 - val_acc: 0.9924
Epoch 187/200
 - 1s - loss: 0.0213 - acc: 0.9939 - val_loss: 0.0215 - val_acc: 0.9924
Epoch 188/200
 - 1s - loss: 0.0223 - acc: 0.9931 - val_loss: 0.0215 - val_acc: 0.9924
Epoch 189/200
 - 1s - loss: 0.0220 - acc: 0.9928 - val_loss: 0.0215 - val_acc: 0.9924
Epoch 190/200
 - 1s - loss: 0.0228 - acc: 0.9935 - val_loss: 0.0215 - val_acc: 0.9924
Epoch 191/200
 - 1s - loss: 0.0223 - acc: 0.9933 - val_loss: 0.0215 - val_acc: 0.9924
Epoch 192/200
 - 1s - loss: 0.0228 - acc: 0.9928 - val_loss: 0.0215 - val_acc: 0.9924
Epoch 193/200
 - 1s - loss: 0.0229 - acc: 0.9923 - val_loss: 0.0215 - val_acc: 0.9924
Epoch 194/200
 - 1s - loss: 0.0226 - acc: 0.9928 - val_loss: 0.0215 - val_acc: 0.9924
Epoch 195/200
 - 1s - loss: 0.0216 - acc: 0.9929 - val_loss: 0.0215 - val_acc: 0.9924
Epoch 196/200
 - 1s - loss: 0.0221 - acc: 0.9931 - val_loss: 0.0215 - val_acc: 0.9924
Epoch 197/200
 - 1s - loss: 0.0215 - acc: 0.9935 - val_loss: 0.0215 - val_acc: 0.9924
Epoch 198/200
 - 1s - loss: 0.0215 - acc: 0.9932 - val_loss: 0.0215 - val_acc: 0.9924
Epoch 199/200
 - 1s - loss: 0.0224 - acc: 0.9931 - val_loss: 0.0215 - val_acc: 0.9924
Epoch 200/200
 - 1s - loss: 0.0234 - acc: 0.9925 - val_loss: 0.0215 - val_acc: 0.9924
2018-03-27 10:11:28,273 [INFO] Evaluate...
2018-03-27 10:11:30,899 [INFO] Done!
2018-03-27 10:11:30,905 [INFO] tpe_transform took 0.002559 seconds
2018-03-27 10:11:30,906 [INFO] TPE using 29/29 trials with best loss 0.013169
2018-03-27 10:11:30,912 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:11:31,928 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 4s - loss: 0.1579 - acc: 0.9507 - val_loss: 0.0591 - val_acc: 0.9876
Epoch 2/200
 - 1s - loss: 0.0627 - acc: 0.9849 - val_loss: 0.0452 - val_acc: 0.9882
... (epochs 3–199 omitted: val_loss decreased steadily from 0.0397 to 0.0224, val_acc rose from 0.9886 to 0.9920) ...
Epoch 200/200
 - 1s - loss: 0.0253 - acc: 0.9926 - val_loss: 0.0224 - val_acc: 0.9920
2018-03-27 10:14:38,190 [INFO] Evaluate...
2018-03-27 10:14:40,787 [INFO] Done!
2018-03-27 10:14:40,793 [INFO] tpe_transform took 0.002490 seconds
2018-03-27 10:14:40,794 [INFO] TPE using 30/30 trials with best loss 0.013169
2018-03-27 10:14:40,802 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:14:41,789 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 4s - loss: 0.1205 - acc: 0.9630 - val_loss: 0.0519 - val_acc: 0.9904
Epoch 2/200
 - 1s - loss: 0.0569 - acc: 0.9854 - val_loss: 0.0423 - val_acc: 0.9908
Epoch 3/200
 - 1s - loss: 0.0495 - acc: 0.9872 - val_loss: 0.0387 - val_acc: 0.9914
Epoch 4/200
 - 1s - loss: 0.0450 - acc: 0.9883 - val_loss: 0.0366 - val_acc: 0.9916
  ... (epochs 5–199 omitted; loss declined steadily from 0.0432 to 0.0287, val_loss from 0.0351 to 0.0250, val_acc plateauing around 0.9930)
Epoch 200/200
 - 1s - loss: 0.0292 - acc: 0.9914 - val_loss: 0.0250 - val_acc: 0.9930
2018-03-27 10:17:48,476 [INFO] Evaluate...
2018-03-27 10:17:51,129 [INFO] Done!
2018-03-27 10:17:51,136 [INFO] tpe_transform took 0.002513 seconds
2018-03-27 10:17:51,136 [INFO] TPE using 31/31 trials with best loss 0.013169
2018-03-27 10:17:51,144 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:17:52,134 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 5s - loss: 0.0487 - acc: 0.9808 - val_loss: 0.0153 - val_acc: 0.9942
Epoch 2/200
 - 1s - loss: 0.0231 - acc: 0.9922 - val_loss: 0.0138 - val_acc: 0.9946
Epoch 3/200
 - 1s - loss: 0.0218 - acc: 0.9929 - val_loss: 0.0133 - val_acc: 0.9954
Epoch 4/200
 - 1s - loss: 0.0214 - acc: 0.9928 - val_loss: 0.0131 - val_acc: 0.9954
  ... (epochs 5–193 omitted; loss declined from 0.0195 to roughly 0.0137, val_loss from 0.0127 to 0.0111, val_acc settling at 0.9966)
Epoch 194/200
 - 1s - loss: 0.0140 - acc: 0.9956 - val_loss: 0.0111 - val_acc: 0.9966
Epoch 195/200
 - 1s - loss: 0.0138 - acc: 0.9960 - val_loss: 0.0111 - val_acc: 0.9966
Epoch 196/200
 - 1s - loss: 0.0132 - acc: 0.9955 - val_loss: 0.0111 - val_acc: 0.9966
Epoch 197/200
 - 1s - loss: 0.0139 - acc: 0.9955 - val_loss: 0.0111 - val_acc: 0.9966
Epoch 198/200
 - 1s - loss: 0.0136 - acc: 0.9958 - val_loss: 0.0111 - val_acc: 0.9966
Epoch 199/200
 - 1s - loss: 0.0133 - acc: 0.9960 - val_loss: 0.0111 - val_acc: 0.9966
Epoch 200/200
 - 1s - loss: 0.0125 - acc: 0.9963 - val_loss: 0.0111 - val_acc: 0.9966
2018-03-27 10:20:58,692 [INFO] Evaluate...
2018-03-27 10:21:01,407 [INFO] Done!
2018-03-27 10:21:01,414 [INFO] tpe_transform took 0.003149 seconds
2018-03-27 10:21:01,415 [INFO] TPE using 32/32 trials with best loss 0.011121
2018-03-27 10:21:01,422 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:21:02,417 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 5s - loss: 0.0542 - acc: 0.9777 - val_loss: 0.0209 - val_acc: 0.9926
... (epochs 2–199 omitted; val_loss converges from 0.0209 to 0.0165, val_acc from 0.9926 to 0.9948)
Epoch 200/200
 - 1s - loss: 0.0223 - acc: 0.9926 - val_loss: 0.0165 - val_acc: 0.9948
2018-03-27 10:24:09,950 [INFO] Evaluate...
2018-03-27 10:24:12,679 [INFO] Done!
2018-03-27 10:24:12,686 [INFO] tpe_transform took 0.002534 seconds
2018-03-27 10:24:12,686 [INFO] TPE using 33/33 trials with best loss 0.011121
2018-03-27 10:24:12,694 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:24:13,687 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 5s - loss: 0.0573 - acc: 0.9781 - val_loss: 0.0247 - val_acc: 0.9908
... (epochs 2–91 omitted; val_loss improves from 0.0247 to 0.0182, val_acc from 0.9908 to 0.9924)
Epoch 92/200
 - 1s - loss: 0.0197 - acc: 0.9937 - val_loss: 0.0182 - val_acc: 0.9924
2018-03-27 10:25:45,190 [INFO] Evaluate...
2018-03-27 10:25:47,941 [INFO] Done!
2018-03-27 10:25:47,947 [INFO] tpe_transform took 0.002476 seconds
2018-03-27 10:25:47,948 [INFO] TPE using 34/34 trials with best loss 0.011121
2018-03-27 10:25:47,955 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:25:48,944 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 5s - loss: 0.0487 - acc: 0.9804 - val_loss: 0.0179 - val_acc: 0.9936
... (epochs 2–59 omitted; val_loss improves from 0.0179 to 0.0141, val_acc from 0.9936 to 0.9950)
Epoch 60/200
 - 1s - loss: 0.0142 - acc: 0.9955 - val_loss: 0.0141 - val_acc: 0.9950
Epoch 61/200
 - 1s - loss: 0.0137 - acc: 0.9959 - val_loss: 0.0141 - val_acc: 0.9950
Epoch 62/200
 - 1s - loss: 0.0140 - acc: 0.9954 - val_loss: 0.0141 - val_acc: 0.9948
Epoch 63/200
 - 1s - loss: 0.0138 - acc: 0.9951 - val_loss: 0.0141 - val_acc: 0.9950
Epoch 64/200
 - 1s - loss: 0.0149 - acc: 0.9954 - val_loss: 0.0141 - val_acc: 0.9950
Epoch 65/200
 - 1s - loss: 0.0137 - acc: 0.9955 - val_loss: 0.0141 - val_acc: 0.9950
Epoch 66/200
 - 1s - loss: 0.0140 - acc: 0.9951 - val_loss: 0.0141 - val_acc: 0.9950
Epoch 67/200
 - 1s - loss: 0.0136 - acc: 0.9953 - val_loss: 0.0141 - val_acc: 0.9950
Epoch 68/200
 - 1s - loss: 0.0138 - acc: 0.9956 - val_loss: 0.0141 - val_acc: 0.9950
Epoch 69/200
 - 1s - loss: 0.0131 - acc: 0.9956 - val_loss: 0.0141 - val_acc: 0.9950
Epoch 70/200
 - 1s - loss: 0.0142 - acc: 0.9955 - val_loss: 0.0141 - val_acc: 0.9950
Epoch 71/200
 - 1s - loss: 0.0142 - acc: 0.9949 - val_loss: 0.0141 - val_acc: 0.9950
Epoch 72/200
 - 1s - loss: 0.0149 - acc: 0.9955 - val_loss: 0.0141 - val_acc: 0.9950
Epoch 73/200
 - 1s - loss: 0.0136 - acc: 0.9954 - val_loss: 0.0141 - val_acc: 0.9950
Epoch 74/200
 - 1s - loss: 0.0145 - acc: 0.9953 - val_loss: 0.0141 - val_acc: 0.9950
Epoch 75/200
 - 1s - loss: 0.0140 - acc: 0.9958 - val_loss: 0.0141 - val_acc: 0.9950
Epoch 76/200
 - 1s - loss: 0.0139 - acc: 0.9957 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 77/200
 - 1s - loss: 0.0128 - acc: 0.9959 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 78/200
 - 1s - loss: 0.0134 - acc: 0.9958 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 79/200
 - 1s - loss: 0.0126 - acc: 0.9960 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 80/200
 - 1s - loss: 0.0141 - acc: 0.9952 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 81/200
 - 1s - loss: 0.0138 - acc: 0.9960 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 82/200
 - 1s - loss: 0.0136 - acc: 0.9959 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 83/200
 - 1s - loss: 0.0137 - acc: 0.9955 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 84/200
 - 1s - loss: 0.0138 - acc: 0.9953 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 85/200
 - 1s - loss: 0.0139 - acc: 0.9956 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 86/200
 - 1s - loss: 0.0135 - acc: 0.9957 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 87/200
 - 1s - loss: 0.0138 - acc: 0.9955 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 88/200
 - 1s - loss: 0.0136 - acc: 0.9963 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 89/200
 - 1s - loss: 0.0133 - acc: 0.9960 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 90/200
 - 1s - loss: 0.0139 - acc: 0.9951 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 91/200
 - 1s - loss: 0.0141 - acc: 0.9954 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 92/200
 - 1s - loss: 0.0136 - acc: 0.9958 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 93/200
 - 1s - loss: 0.0133 - acc: 0.9958 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 94/200
 - 1s - loss: 0.0141 - acc: 0.9954 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 95/200
 - 1s - loss: 0.0127 - acc: 0.9960 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 96/200
 - 1s - loss: 0.0133 - acc: 0.9960 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 97/200
 - 1s - loss: 0.0132 - acc: 0.9961 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 98/200
 - 1s - loss: 0.0134 - acc: 0.9960 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 99/200
 - 1s - loss: 0.0135 - acc: 0.9956 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 100/200
 - 1s - loss: 0.0134 - acc: 0.9956 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 101/200
 - 1s - loss: 0.0130 - acc: 0.9960 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 102/200
 - 1s - loss: 0.0139 - acc: 0.9955 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 103/200
 - 1s - loss: 0.0129 - acc: 0.9958 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 104/200
 - 1s - loss: 0.0135 - acc: 0.9955 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 105/200
 - 1s - loss: 0.0129 - acc: 0.9957 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 106/200
 - 1s - loss: 0.0127 - acc: 0.9964 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 107/200
 - 1s - loss: 0.0142 - acc: 0.9954 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 108/200
 - 1s - loss: 0.0129 - acc: 0.9956 - val_loss: 0.0140 - val_acc: 0.9950
Epoch 109/200
 - 1s - loss: 0.0140 - acc: 0.9953 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 110/200
 - 1s - loss: 0.0126 - acc: 0.9960 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 111/200
 - 1s - loss: 0.0132 - acc: 0.9959 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 112/200
 - 1s - loss: 0.0133 - acc: 0.9955 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 113/200
 - 1s - loss: 0.0132 - acc: 0.9954 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 114/200
 - 1s - loss: 0.0131 - acc: 0.9961 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 115/200
 - 1s - loss: 0.0127 - acc: 0.9959 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 116/200
 - 1s - loss: 0.0142 - acc: 0.9954 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 117/200
 - 1s - loss: 0.0131 - acc: 0.9959 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 118/200
 - 1s - loss: 0.0133 - acc: 0.9954 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 119/200
 - 1s - loss: 0.0137 - acc: 0.9960 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 120/200
 - 1s - loss: 0.0132 - acc: 0.9959 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 121/200
 - 1s - loss: 0.0137 - acc: 0.9955 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 122/200
 - 1s - loss: 0.0146 - acc: 0.9948 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 123/200
 - 1s - loss: 0.0136 - acc: 0.9955 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 124/200
 - 1s - loss: 0.0136 - acc: 0.9955 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 125/200
 - 1s - loss: 0.0129 - acc: 0.9957 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 126/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 127/200
 - 1s - loss: 0.0136 - acc: 0.9959 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 128/200
 - 1s - loss: 0.0141 - acc: 0.9956 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 129/200
 - 1s - loss: 0.0132 - acc: 0.9960 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 130/200
 - 1s - loss: 0.0131 - acc: 0.9955 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 131/200
 - 1s - loss: 0.0130 - acc: 0.9957 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 132/200
 - 1s - loss: 0.0137 - acc: 0.9950 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 133/200
 - 1s - loss: 0.0137 - acc: 0.9956 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 134/200
 - 1s - loss: 0.0131 - acc: 0.9961 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 135/200
 - 1s - loss: 0.0133 - acc: 0.9959 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 136/200
 - 1s - loss: 0.0138 - acc: 0.9955 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 137/200
 - 1s - loss: 0.0133 - acc: 0.9956 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 138/200
 - 1s - loss: 0.0141 - acc: 0.9954 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 139/200
 - 1s - loss: 0.0131 - acc: 0.9955 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 140/200
 - 1s - loss: 0.0137 - acc: 0.9954 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 141/200
 - 1s - loss: 0.0125 - acc: 0.9960 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 142/200
 - 1s - loss: 0.0138 - acc: 0.9952 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 143/200
 - 1s - loss: 0.0135 - acc: 0.9960 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 144/200
 - 1s - loss: 0.0138 - acc: 0.9956 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 145/200
 - 1s - loss: 0.0138 - acc: 0.9955 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 146/200
 - 1s - loss: 0.0132 - acc: 0.9959 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 147/200
 - 1s - loss: 0.0130 - acc: 0.9963 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 148/200
 - 1s - loss: 0.0132 - acc: 0.9958 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 149/200
 - 1s - loss: 0.0138 - acc: 0.9953 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 150/200
 - 1s - loss: 0.0134 - acc: 0.9954 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 151/200
 - 1s - loss: 0.0125 - acc: 0.9961 - val_loss: 0.0139 - val_acc: 0.9950
Epoch 152/200
 - 1s - loss: 0.0140 - acc: 0.9953 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 153/200
 - 1s - loss: 0.0124 - acc: 0.9965 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 154/200
 - 1s - loss: 0.0134 - acc: 0.9958 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 155/200
 - 1s - loss: 0.0134 - acc: 0.9954 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 156/200
 - 1s - loss: 0.0135 - acc: 0.9955 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 157/200
 - 1s - loss: 0.0134 - acc: 0.9954 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 158/200
 - 1s - loss: 0.0138 - acc: 0.9960 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 159/200
 - 1s - loss: 0.0137 - acc: 0.9959 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 160/200
 - 1s - loss: 0.0130 - acc: 0.9958 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 161/200
 - 1s - loss: 0.0139 - acc: 0.9951 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 162/200
 - 1s - loss: 0.0133 - acc: 0.9959 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 163/200
 - 1s - loss: 0.0132 - acc: 0.9959 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 164/200
 - 1s - loss: 0.0127 - acc: 0.9959 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 165/200
 - 1s - loss: 0.0129 - acc: 0.9958 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 166/200
 - 1s - loss: 0.0143 - acc: 0.9958 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 167/200
 - 1s - loss: 0.0134 - acc: 0.9958 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 168/200
 - 1s - loss: 0.0125 - acc: 0.9959 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 169/200
 - 1s - loss: 0.0132 - acc: 0.9959 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 170/200
 - 1s - loss: 0.0139 - acc: 0.9956 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 171/200
 - 1s - loss: 0.0130 - acc: 0.9956 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 172/200
 - 1s - loss: 0.0137 - acc: 0.9960 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 173/200
 - 1s - loss: 0.0124 - acc: 0.9963 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 174/200
 - 1s - loss: 0.0134 - acc: 0.9958 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 175/200
 - 1s - loss: 0.0128 - acc: 0.9959 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 176/200
 - 1s - loss: 0.0134 - acc: 0.9952 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 177/200
 - 1s - loss: 0.0128 - acc: 0.9959 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 178/200
 - 1s - loss: 0.0126 - acc: 0.9960 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 179/200
 - 1s - loss: 0.0131 - acc: 0.9960 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 180/200
 - 1s - loss: 0.0125 - acc: 0.9959 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 181/200
 - 1s - loss: 0.0122 - acc: 0.9959 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 182/200
 - 1s - loss: 0.0127 - acc: 0.9963 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 183/200
 - 1s - loss: 0.0130 - acc: 0.9958 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 184/200
 - 1s - loss: 0.0131 - acc: 0.9960 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 185/200
 - 1s - loss: 0.0129 - acc: 0.9959 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 186/200
 - 1s - loss: 0.0132 - acc: 0.9956 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 187/200
 - 1s - loss: 0.0124 - acc: 0.9963 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 188/200
 - 1s - loss: 0.0130 - acc: 0.9955 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 189/200
 - 1s - loss: 0.0127 - acc: 0.9956 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 190/200
 - 1s - loss: 0.0137 - acc: 0.9954 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 191/200
 - 1s - loss: 0.0128 - acc: 0.9961 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 192/200
 - 1s - loss: 0.0121 - acc: 0.9963 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 193/200
 - 1s - loss: 0.0137 - acc: 0.9954 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 194/200
 - 1s - loss: 0.0132 - acc: 0.9956 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 195/200
 - 1s - loss: 0.0128 - acc: 0.9963 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 196/200
 - 1s - loss: 0.0131 - acc: 0.9960 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 197/200
 - 1s - loss: 0.0131 - acc: 0.9958 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 198/200
 - 1s - loss: 0.0125 - acc: 0.9958 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 199/200
 - 1s - loss: 0.0133 - acc: 0.9951 - val_loss: 0.0138 - val_acc: 0.9950
Epoch 200/200
 - 1s - loss: 0.0133 - acc: 0.9958 - val_loss: 0.0138 - val_acc: 0.9950
2018-03-27 10:28:55,528 [INFO] Evaluate...
2018-03-27 10:28:58,295 [INFO] Done!
2018-03-27 10:28:58,788 [INFO] tpe_transform took 0.488425 seconds
2018-03-27 10:28:58,789 [INFO] TPE using 35/35 trials with best loss 0.011121
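The "best loss" that TPE reports here is the binary log loss (cross-entropy), the metric this competition is scored on. A minimal NumPy sketch of how it is computed (the `eps` clipping, which keeps `log()` finite for predictions at exactly 0 or 1, is a common convention and an assumption on my part, not something shown in these logs):

```python
import numpy as np

def log_loss(y_true, y_pred, eps=1e-15):
    """Binary cross-entropy; predictions are clipped away from 0 and 1
    so the logarithms stay finite."""
    p = np.clip(y_pred, eps, 1 - eps)
    return float(-np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p)))

# Toy example: four images, three confident correct predictions and one weaker one.
y_true = np.array([1, 0, 1, 1])
y_pred = np.array([0.99, 0.02, 0.95, 0.90])
print(round(log_loss(y_true, y_pred), 4))  # → 0.0467
```

Clipping matters in practice: submitting a hard 0 or 1 that turns out wrong would otherwise contribute an unbounded penalty.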
2018-03-27 10:28:58,797 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
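The three "load features" lines above correspond to pre-extracted bottleneck features from InceptionV3, Xception, and ResNet50. A minimal sketch of the stacking step that typically follows (the array shapes here are stand-ins for the real `.h5` contents, whose feature widths depend on each network's pooled output):

```python
import numpy as np

# Hypothetical stand-ins for the bottleneck features stored in
# original_InceptionV3.h5, original_Xception.h5, original_ResNet50.h5.
# 8 images, 2048-dim pooled features per network (widths are assumptions).
inception_feats = np.random.rand(8, 2048)
xception_feats = np.random.rand(8, 2048)
resnet_feats = np.random.rand(8, 2048)

# Concatenate per-image features from all three networks into one wide
# vector; the small dense classifier in the logs trains on this input.
stacked = np.concatenate([inception_feats, xception_feats, resnet_feats], axis=1)
print(stacked.shape)  # → (8, 6144)
```

Because the convolutional bases are frozen and only this small head is trained, each epoch takes about a second, consistent with the `- 1s -` timings in the logs.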
model created
['loss', 'acc']
2018-03-27 10:28:59,783 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 5s - loss: 0.0537 - acc: 0.9795 - val_loss: 0.0247 - val_acc: 0.9914
Epoch 2/200
 - 1s - loss: 0.0280 - acc: 0.9902 - val_loss: 0.0227 - val_acc: 0.9924
Epoch 3/200
 - 1s - loss: 0.0266 - acc: 0.9909 - val_loss: 0.0222 - val_acc: 0.9930
... (epochs 4-158 omitted: val_loss converges to ≈ 0.0191, val_acc ≈ 0.9942)
Epoch 159/200
 - 1s - loss: 0.0184 - acc: 0.9941 - val_loss: 0.0191 - val_acc: 0.9942
2018-03-27 10:31:30,563 [INFO] Evaluate...
2018-03-27 10:31:33,397 [INFO] Done!
2018-03-27 10:31:33,403 [INFO] tpe_transform took 0.002492 seconds
2018-03-27 10:31:33,404 [INFO] TPE using 36/36 trials with best loss 0.011121
2018-03-27 10:31:33,412 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:31:34,400 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 5s - loss: 0.0539 - acc: 0.9804 - val_loss: 0.0264 - val_acc: 0.9912
Epoch 2/200
 - 1s - loss: 0.0243 - acc: 0.9918 - val_loss: 0.0240 - val_acc: 0.9924
Epoch 3/200
 - 1s - loss: 0.0226 - acc: 0.9927 - val_loss: 0.0229 - val_acc: 0.9926
... (epochs 4-63 omitted: val_loss falls steadily from 0.0224 to 0.0199)
Epoch 64/200
 - 1s - loss: 0.0158 - acc: 0.9952 - val_loss: 0.0199 - val_acc: 0.9934
Epoch 65/200
 - 1s - loss: 0.0149 - acc: 0.9955 - val_loss: 0.0198 - val_acc: 0.9934
Epoch 66/200
 - 1s - loss: 0.0155 - acc: 0.9953 - val_loss: 0.0198 - val_acc: 0.9934
Epoch 67/200
 - 1s - loss: 0.0152 - acc: 0.9953 - val_loss: 0.0198 - val_acc: 0.9934
Epoch 68/200
 - 1s - loss: 0.0155 - acc: 0.9950 - val_loss: 0.0198 - val_acc: 0.9934
Epoch 69/200
 - 1s - loss: 0.0150 - acc: 0.9951 - val_loss: 0.0198 - val_acc: 0.9934
Epoch 70/200
 - 1s - loss: 0.0158 - acc: 0.9946 - val_loss: 0.0198 - val_acc: 0.9934
Epoch 71/200
 - 1s - loss: 0.0155 - acc: 0.9956 - val_loss: 0.0198 - val_acc: 0.9934
Epoch 72/200
 - 1s - loss: 0.0151 - acc: 0.9958 - val_loss: 0.0198 - val_acc: 0.9934
Epoch 73/200
 - 1s - loss: 0.0157 - acc: 0.9946 - val_loss: 0.0198 - val_acc: 0.9934
Epoch 74/200
 - 1s - loss: 0.0143 - acc: 0.9959 - val_loss: 0.0198 - val_acc: 0.9934
Epoch 75/200
 - 1s - loss: 0.0150 - acc: 0.9962 - val_loss: 0.0198 - val_acc: 0.9934
Epoch 76/200
 - 1s - loss: 0.0151 - acc: 0.9959 - val_loss: 0.0198 - val_acc: 0.9934
Epoch 77/200
 - 1s - loss: 0.0151 - acc: 0.9960 - val_loss: 0.0198 - val_acc: 0.9934
Epoch 78/200
 - 1s - loss: 0.0153 - acc: 0.9952 - val_loss: 0.0198 - val_acc: 0.9934
Epoch 79/200
 - 1s - loss: 0.0149 - acc: 0.9954 - val_loss: 0.0198 - val_acc: 0.9934
Epoch 80/200
 - 1s - loss: 0.0147 - acc: 0.9952 - val_loss: 0.0198 - val_acc: 0.9934
Epoch 81/200
 - 1s - loss: 0.0157 - acc: 0.9954 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 82/200
 - 1s - loss: 0.0145 - acc: 0.9956 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 83/200
 - 1s - loss: 0.0149 - acc: 0.9951 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 84/200
 - 1s - loss: 0.0152 - acc: 0.9948 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 85/200
 - 1s - loss: 0.0147 - acc: 0.9958 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 86/200
 - 1s - loss: 0.0151 - acc: 0.9953 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 87/200
 - 1s - loss: 0.0151 - acc: 0.9956 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 88/200
 - 1s - loss: 0.0150 - acc: 0.9954 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 89/200
 - 1s - loss: 0.0148 - acc: 0.9956 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 90/200
 - 1s - loss: 0.0148 - acc: 0.9950 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 91/200
 - 1s - loss: 0.0147 - acc: 0.9956 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 92/200
 - 1s - loss: 0.0151 - acc: 0.9954 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 93/200
 - 1s - loss: 0.0150 - acc: 0.9954 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 94/200
 - 1s - loss: 0.0146 - acc: 0.9950 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 95/200
 - 1s - loss: 0.0157 - acc: 0.9951 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 96/200
 - 1s - loss: 0.0146 - acc: 0.9956 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 97/200
 - 1s - loss: 0.0152 - acc: 0.9953 - val_loss: 0.0197 - val_acc: 0.9934
Epoch 98/200
 - 1s - loss: 0.0151 - acc: 0.9951 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 99/200
 - 1s - loss: 0.0152 - acc: 0.9954 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 100/200
 - 1s - loss: 0.0148 - acc: 0.9957 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 101/200
 - 1s - loss: 0.0152 - acc: 0.9955 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 102/200
 - 1s - loss: 0.0148 - acc: 0.9958 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 103/200
 - 1s - loss: 0.0151 - acc: 0.9954 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 104/200
 - 1s - loss: 0.0154 - acc: 0.9953 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 105/200
 - 1s - loss: 0.0149 - acc: 0.9947 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 106/200
 - 1s - loss: 0.0142 - acc: 0.9955 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 107/200
 - 1s - loss: 0.0152 - acc: 0.9951 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 108/200
 - 1s - loss: 0.0142 - acc: 0.9959 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 109/200
 - 1s - loss: 0.0147 - acc: 0.9953 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 110/200
 - 1s - loss: 0.0149 - acc: 0.9951 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 111/200
 - 1s - loss: 0.0149 - acc: 0.9950 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 112/200
 - 1s - loss: 0.0149 - acc: 0.9955 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 113/200
 - 1s - loss: 0.0147 - acc: 0.9956 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 114/200
 - 1s - loss: 0.0146 - acc: 0.9958 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 115/200
 - 1s - loss: 0.0149 - acc: 0.9958 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 116/200
 - 1s - loss: 0.0146 - acc: 0.9958 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 117/200
 - 1s - loss: 0.0153 - acc: 0.9955 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 118/200
 - 1s - loss: 0.0149 - acc: 0.9957 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 119/200
 - 1s - loss: 0.0149 - acc: 0.9951 - val_loss: 0.0196 - val_acc: 0.9932
Epoch 120/200
 - 1s - loss: 0.0150 - acc: 0.9954 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 121/200
 - 1s - loss: 0.0148 - acc: 0.9956 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 122/200
 - 1s - loss: 0.0150 - acc: 0.9953 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 123/200
 - 1s - loss: 0.0149 - acc: 0.9958 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 124/200
 - 1s - loss: 0.0150 - acc: 0.9956 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 125/200
 - 1s - loss: 0.0147 - acc: 0.9958 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 126/200
 - 1s - loss: 0.0150 - acc: 0.9953 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 127/200
 - 1s - loss: 0.0140 - acc: 0.9961 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 128/200
 - 1s - loss: 0.0147 - acc: 0.9957 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 129/200
 - 1s - loss: 0.0142 - acc: 0.9953 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 130/200
 - 1s - loss: 0.0148 - acc: 0.9956 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 131/200
 - 1s - loss: 0.0148 - acc: 0.9956 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 132/200
 - 1s - loss: 0.0148 - acc: 0.9952 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 133/200
 - 1s - loss: 0.0147 - acc: 0.9956 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 134/200
 - 1s - loss: 0.0141 - acc: 0.9957 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 135/200
 - 1s - loss: 0.0151 - acc: 0.9953 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 136/200
 - 1s - loss: 0.0149 - acc: 0.9951 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 137/200
 - 1s - loss: 0.0143 - acc: 0.9960 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 138/200
 - 1s - loss: 0.0144 - acc: 0.9958 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 139/200
 - 1s - loss: 0.0142 - acc: 0.9960 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 140/200
 - 1s - loss: 0.0145 - acc: 0.9955 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 141/200
 - 1s - loss: 0.0138 - acc: 0.9957 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 142/200
 - 1s - loss: 0.0151 - acc: 0.9955 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 143/200
 - 1s - loss: 0.0148 - acc: 0.9955 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 144/200
 - 1s - loss: 0.0143 - acc: 0.9960 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 145/200
 - 1s - loss: 0.0147 - acc: 0.9953 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 146/200
 - 1s - loss: 0.0148 - acc: 0.9958 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 147/200
 - 1s - loss: 0.0139 - acc: 0.9964 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 148/200
 - 1s - loss: 0.0152 - acc: 0.9955 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 149/200
 - 1s - loss: 0.0145 - acc: 0.9958 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 150/200
 - 1s - loss: 0.0144 - acc: 0.9954 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 151/200
 - 1s - loss: 0.0149 - acc: 0.9957 - val_loss: 0.0195 - val_acc: 0.9932
Epoch 152/200
 - 1s - loss: 0.0144 - acc: 0.9959 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 153/200
 - 1s - loss: 0.0147 - acc: 0.9956 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 154/200
 - 1s - loss: 0.0147 - acc: 0.9957 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 155/200
 - 1s - loss: 0.0141 - acc: 0.9960 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 156/200
 - 1s - loss: 0.0143 - acc: 0.9952 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 157/200
 - 1s - loss: 0.0148 - acc: 0.9951 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 158/200
 - 1s - loss: 0.0143 - acc: 0.9953 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 159/200
 - 1s - loss: 0.0149 - acc: 0.9955 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 160/200
 - 1s - loss: 0.0143 - acc: 0.9960 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 161/200
 - 1s - loss: 0.0140 - acc: 0.9957 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 162/200
 - 1s - loss: 0.0145 - acc: 0.9954 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 163/200
 - 1s - loss: 0.0144 - acc: 0.9956 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 164/200
 - 1s - loss: 0.0151 - acc: 0.9959 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 165/200
 - 1s - loss: 0.0145 - acc: 0.9958 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 166/200
 - 1s - loss: 0.0144 - acc: 0.9952 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 167/200
 - 1s - loss: 0.0144 - acc: 0.9957 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 168/200
 - 1s - loss: 0.0141 - acc: 0.9960 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 169/200
 - 1s - loss: 0.0146 - acc: 0.9950 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 170/200
 - 1s - loss: 0.0143 - acc: 0.9958 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 171/200
 - 1s - loss: 0.0145 - acc: 0.9958 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 172/200
 - 1s - loss: 0.0145 - acc: 0.9954 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 173/200
 - 1s - loss: 0.0144 - acc: 0.9957 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 174/200
 - 1s - loss: 0.0142 - acc: 0.9955 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 175/200
 - 1s - loss: 0.0142 - acc: 0.9955 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 176/200
 - 1s - loss: 0.0140 - acc: 0.9957 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 177/200
 - 1s - loss: 0.0142 - acc: 0.9960 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 178/200
 - 1s - loss: 0.0148 - acc: 0.9958 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 179/200
 - 1s - loss: 0.0148 - acc: 0.9950 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 180/200
 - 1s - loss: 0.0143 - acc: 0.9958 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 181/200
 - 1s - loss: 0.0143 - acc: 0.9956 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 182/200
 - 1s - loss: 0.0151 - acc: 0.9949 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 183/200
 - 1s - loss: 0.0146 - acc: 0.9954 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 184/200
 - 1s - loss: 0.0144 - acc: 0.9955 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 185/200
 - 1s - loss: 0.0140 - acc: 0.9960 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 186/200
 - 1s - loss: 0.0150 - acc: 0.9958 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 187/200
 - 1s - loss: 0.0146 - acc: 0.9953 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 188/200
 - 1s - loss: 0.0141 - acc: 0.9956 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 189/200
 - 1s - loss: 0.0140 - acc: 0.9959 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 190/200
 - 1s - loss: 0.0144 - acc: 0.9953 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 191/200
 - 1s - loss: 0.0146 - acc: 0.9957 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 192/200
 - 1s - loss: 0.0137 - acc: 0.9959 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 193/200
 - 1s - loss: 0.0135 - acc: 0.9961 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 194/200
 - 1s - loss: 0.0147 - acc: 0.9954 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 195/200
 - 1s - loss: 0.0143 - acc: 0.9955 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 196/200
 - 1s - loss: 0.0144 - acc: 0.9960 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 197/200
 - 1s - loss: 0.0146 - acc: 0.9957 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 198/200
 - 1s - loss: 0.0142 - acc: 0.9958 - val_loss: 0.0194 - val_acc: 0.9932
Epoch 199/200
 - 1s - loss: 0.0146 - acc: 0.9954 - val_loss: 0.0194 - val_acc: 0.9930
Epoch 200/200
 - 1s - loss: 0.0146 - acc: 0.9954 - val_loss: 0.0193 - val_acc: 0.9930
2018-03-27 10:34:42,333 [INFO] Evaluate...
2018-03-27 10:34:45,236 [INFO] Done!
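The `tpe_transform` / `TPE using N/N trials with best loss …` lines are hyperopt's TPE optimizer at work: each trial reloads the pre-extracted InceptionV3/Xception/ResNet50 features, builds a small dense classifier, trains it, and reports the validation loss back to the optimizer. A minimal sketch of that outer loop follows; the search space, the parameter names, and the toy objective are all assumptions for illustration, and a plain random sampler stands in for hyperopt's `tpe.suggest` so the sketch has no extra dependency:

```python
import random

# Hypothetical search space mirroring what a trial might tune.
SPACE = {
    "lr":      [1e-4, 3e-4, 1e-3],
    "dropout": [0.25, 0.5],
    "units":   [0, 256, 512],   # 0 = no hidden layer
}

def objective(params):
    """Stand-in for one trial. In the notebook this step would read
    original_InceptionV3.h5 / original_Xception.h5 / original_ResNet50.h5,
    concatenate the feature vectors, fit a small Keras model for up to
    200 epochs, and return the validation loss. Here a toy score
    replaces the real training so the sketch runs anywhere."""
    return abs(params["lr"] - 3e-4) + params["dropout"] * 0.01

def search(n_trials=38, seed=0):
    """Sample n_trials configurations and keep the best-scoring one,
    the same bookkeeping the 'TPE using 38/38 trials' log reflects."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        params = {k: rng.choice(v) for k, v in SPACE.items()}
        loss = objective(params)
        if best is None or loss < best[0]:
            best = (loss, params)
    return best

best_loss, best_params = search()
```

Unlike this random stand-in, TPE models the distribution of good vs. bad trials and proposes new configurations from it, which is why the logged best loss keeps improving as the trial count grows.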
2018-03-27 10:34:45,243 [INFO] tpe_transform took 0.002460 seconds
2018-03-27 10:34:45,244 [INFO] TPE using 37/37 trials with best loss 0.011121
2018-03-27 10:34:45,252 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:34:46,242 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 5s - loss: 0.0491 - acc: 0.9811 - val_loss: 0.0222 - val_acc: 0.9932
Epoch 2/200
 - 1s - loss: 0.0232 - acc: 0.9914 - val_loss: 0.0224 - val_acc: 0.9932
Epoch 3/200
 - 1s - loss: 0.0203 - acc: 0.9935 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 4/200
 - 1s - loss: 0.0170 - acc: 0.9940 - val_loss: 0.0196 - val_acc: 0.9940
Epoch 5/200
 - 1s - loss: 0.0172 - acc: 0.9944 - val_loss: 0.0194 - val_acc: 0.9944
Epoch 6/200
 - 1s - loss: 0.0153 - acc: 0.9944 - val_loss: 0.0194 - val_acc: 0.9942
Epoch 7/200
 - 1s - loss: 0.0153 - acc: 0.9949 - val_loss: 0.0189 - val_acc: 0.9946
Epoch 8/200
 - 1s - loss: 0.0155 - acc: 0.9952 - val_loss: 0.0189 - val_acc: 0.9946
Epoch 9/200
 - 1s - loss: 0.0142 - acc: 0.9949 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 10/200
 - 1s - loss: 0.0143 - acc: 0.9949 - val_loss: 0.0187 - val_acc: 0.9950
Epoch 11/200
 - 1s - loss: 0.0148 - acc: 0.9947 - val_loss: 0.0191 - val_acc: 0.9948
Epoch 12/200
 - 1s - loss: 0.0130 - acc: 0.9960 - val_loss: 0.0185 - val_acc: 0.9948
Epoch 13/200
 - 1s - loss: 0.0133 - acc: 0.9958 - val_loss: 0.0185 - val_acc: 0.9946
Epoch 14/200
 - 1s - loss: 0.0146 - acc: 0.9947 - val_loss: 0.0183 - val_acc: 0.9950
Epoch 15/200
 - 1s - loss: 0.0133 - acc: 0.9956 - val_loss: 0.0186 - val_acc: 0.9948
Epoch 16/200
 - 1s - loss: 0.0132 - acc: 0.9956 - val_loss: 0.0184 - val_acc: 0.9948
Epoch 17/200
 - 1s - loss: 0.0134 - acc: 0.9963 - val_loss: 0.0185 - val_acc: 0.9952
Epoch 18/200
 - 1s - loss: 0.0124 - acc: 0.9959 - val_loss: 0.0185 - val_acc: 0.9952
Epoch 19/200
 - 1s - loss: 0.0124 - acc: 0.9961 - val_loss: 0.0185 - val_acc: 0.9952
Epoch 20/200
 - 1s - loss: 0.0131 - acc: 0.9960 - val_loss: 0.0184 - val_acc: 0.9952
Epoch 21/200
 - 1s - loss: 0.0130 - acc: 0.9960 - val_loss: 0.0182 - val_acc: 0.9948
Epoch 22/200
 - 1s - loss: 0.0125 - acc: 0.9960 - val_loss: 0.0182 - val_acc: 0.9952
Epoch 23/200
 - 1s - loss: 0.0128 - acc: 0.9955 - val_loss: 0.0182 - val_acc: 0.9950
Epoch 24/200
 - 1s - loss: 0.0127 - acc: 0.9960 - val_loss: 0.0181 - val_acc: 0.9948
Epoch 25/200
 - 1s - loss: 0.0133 - acc: 0.9956 - val_loss: 0.0182 - val_acc: 0.9950
Epoch 26/200
 - 1s - loss: 0.0125 - acc: 0.9961 - val_loss: 0.0181 - val_acc: 0.9948
Epoch 27/200
 - 1s - loss: 0.0122 - acc: 0.9959 - val_loss: 0.0181 - val_acc: 0.9950
Epoch 28/200
 - 1s - loss: 0.0121 - acc: 0.9956 - val_loss: 0.0180 - val_acc: 0.9952
Epoch 29/200
 - 1s - loss: 0.0123 - acc: 0.9964 - val_loss: 0.0180 - val_acc: 0.9950
Epoch 30/200
 - 1s - loss: 0.0129 - acc: 0.9961 - val_loss: 0.0180 - val_acc: 0.9950
Epoch 31/200
 - 1s - loss: 0.0118 - acc: 0.9959 - val_loss: 0.0180 - val_acc: 0.9950
Epoch 32/200
 - 1s - loss: 0.0119 - acc: 0.9960 - val_loss: 0.0179 - val_acc: 0.9952
Epoch 33/200
 - 1s - loss: 0.0118 - acc: 0.9962 - val_loss: 0.0179 - val_acc: 0.9948
Epoch 34/200
 - 1s - loss: 0.0120 - acc: 0.9958 - val_loss: 0.0179 - val_acc: 0.9950
Epoch 35/200
 - 1s - loss: 0.0126 - acc: 0.9956 - val_loss: 0.0179 - val_acc: 0.9950
Epoch 36/200
 - 1s - loss: 0.0119 - acc: 0.9960 - val_loss: 0.0179 - val_acc: 0.9950
Epoch 37/200
 - 1s - loss: 0.0118 - acc: 0.9958 - val_loss: 0.0179 - val_acc: 0.9952
Epoch 38/200
 - 1s - loss: 0.0113 - acc: 0.9966 - val_loss: 0.0179 - val_acc: 0.9950
Epoch 39/200
 - 1s - loss: 0.0120 - acc: 0.9958 - val_loss: 0.0178 - val_acc: 0.9950
Epoch 40/200
 - 1s - loss: 0.0112 - acc: 0.9969 - val_loss: 0.0178 - val_acc: 0.9952
Epoch 41/200
 - 1s - loss: 0.0119 - acc: 0.9964 - val_loss: 0.0178 - val_acc: 0.9952
Epoch 42/200
 - 1s - loss: 0.0120 - acc: 0.9960 - val_loss: 0.0178 - val_acc: 0.9950
Epoch 43/200
 - 1s - loss: 0.0114 - acc: 0.9967 - val_loss: 0.0179 - val_acc: 0.9950
Epoch 44/200
 - 1s - loss: 0.0121 - acc: 0.9960 - val_loss: 0.0179 - val_acc: 0.9950
Epoch 45/200
 - 1s - loss: 0.0117 - acc: 0.9959 - val_loss: 0.0179 - val_acc: 0.9950
Epoch 46/200
 - 1s - loss: 0.0123 - acc: 0.9958 - val_loss: 0.0179 - val_acc: 0.9950
Epoch 47/200
 - 1s - loss: 0.0122 - acc: 0.9957 - val_loss: 0.0179 - val_acc: 0.9950
Epoch 48/200
 - 1s - loss: 0.0115 - acc: 0.9961 - val_loss: 0.0178 - val_acc: 0.9950
2018-03-27 10:35:39,804 [INFO] Evaluate...
2018-03-27 10:35:42,802 [INFO] Done!
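Each `Load data` step reads the bottleneck features exported earlier from the three pretrained networks and concatenates them into one design matrix, one row per image, before the small classifier is built. A sketch of that merge follows; the array shapes, dataset keys, and file layout are assumptions, and NumPy `.npz` archives stand in for the notebook's `original_*.h5` HDF5 files so the sketch is self-contained:

```python
import os
import tempfile
import numpy as np

tmp = tempfile.mkdtemp()
names = ["InceptionV3", "Xception", "ResNet50"]
n_images, dim = 8, 2048  # assumed: 2048-d pooled features per network

# Write stand-in feature files (the notebook stores these as HDF5).
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, n_images)
for n in names:
    np.savez(os.path.join(tmp, f"original_{n}.npz"),
             train=rng.random((n_images, dim)).astype("float32"),
             label=labels)

# Load and merge: features from the three networks side by side.
feats = []
for n in names:
    data = np.load(os.path.join(tmp, f"original_{n}.npz"))
    feats.append(data["train"])
    y = data["label"]          # labels are identical across files
X = np.concatenate(feats, axis=1)
```

Concatenating the three 2048-d vectors yields a 6144-d input, which is why the per-epoch cost above is so low: only a small dense head is trained on these fixed features, not the convolutional backbones.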
2018-03-27 10:35:42,809 [INFO] tpe_transform took 0.002558 seconds
2018-03-27 10:35:42,809 [INFO] TPE using 38/38 trials with best loss 0.011121
2018-03-27 10:35:42,816 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:35:43,818 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 5s - loss: 0.0466 - acc: 0.9825 - val_loss: 0.0214 - val_acc: 0.9932
Epoch 2/200
 - 1s - loss: 0.0269 - acc: 0.9909 - val_loss: 0.0195 - val_acc: 0.9936
Epoch 3/200
 - 1s - loss: 0.0233 - acc: 0.9922 - val_loss: 0.0189 - val_acc: 0.9938
Epoch 4/200
 - 1s - loss: 0.0221 - acc: 0.9929 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 5/200
 - 1s - loss: 0.0221 - acc: 0.9931 - val_loss: 0.0184 - val_acc: 0.9940
Epoch 6/200
 - 1s - loss: 0.0199 - acc: 0.9931 - val_loss: 0.0179 - val_acc: 0.9950
Epoch 7/200
 - 1s - loss: 0.0207 - acc: 0.9930 - val_loss: 0.0179 - val_acc: 0.9944
Epoch 8/200
 - 1s - loss: 0.0195 - acc: 0.9933 - val_loss: 0.0177 - val_acc: 0.9948
Epoch 9/200
 - 1s - loss: 0.0210 - acc: 0.9928 - val_loss: 0.0175 - val_acc: 0.9948
Epoch 10/200
 - 1s - loss: 0.0208 - acc: 0.9929 - val_loss: 0.0174 - val_acc: 0.9948
Epoch 11/200
 - 1s - loss: 0.0194 - acc: 0.9929 - val_loss: 0.0176 - val_acc: 0.9942
Epoch 12/200
 - 1s - loss: 0.0214 - acc: 0.9928 - val_loss: 0.0173 - val_acc: 0.9942
Epoch 13/200
 - 1s - loss: 0.0196 - acc: 0.9931 - val_loss: 0.0172 - val_acc: 0.9948
Epoch 14/200
 - 1s - loss: 0.0197 - acc: 0.9936 - val_loss: 0.0171 - val_acc: 0.9948
Epoch 15/200
 - 1s - loss: 0.0205 - acc: 0.9932 - val_loss: 0.0170 - val_acc: 0.9948
Epoch 16/200
 - 1s - loss: 0.0208 - acc: 0.9932 - val_loss: 0.0169 - val_acc: 0.9950
Epoch 17/200
 - 1s - loss: 0.0204 - acc: 0.9933 - val_loss: 0.0169 - val_acc: 0.9948
Epoch 18/200
 - 1s - loss: 0.0189 - acc: 0.9937 - val_loss: 0.0169 - val_acc: 0.9946
Epoch 19/200
 - 1s - loss: 0.0197 - acc: 0.9932 - val_loss: 0.0168 - val_acc: 0.9948
Epoch 20/200
 - 1s - loss: 0.0200 - acc: 0.9931 - val_loss: 0.0168 - val_acc: 0.9944
Epoch 21/200
 - 1s - loss: 0.0198 - acc: 0.9931 - val_loss: 0.0168 - val_acc: 0.9944
Epoch 22/200
 - 1s - loss: 0.0190 - acc: 0.9937 - val_loss: 0.0167 - val_acc: 0.9950
Epoch 23/200
 - 1s - loss: 0.0180 - acc: 0.9940 - val_loss: 0.0167 - val_acc: 0.9946
Epoch 24/200
 - 1s - loss: 0.0205 - acc: 0.9929 - val_loss: 0.0167 - val_acc: 0.9948
Epoch 25/200
 - 1s - loss: 0.0198 - acc: 0.9930 - val_loss: 0.0166 - val_acc: 0.9950
Epoch 26/200
 - 1s - loss: 0.0191 - acc: 0.9935 - val_loss: 0.0166 - val_acc: 0.9948
Epoch 27/200
 - 1s - loss: 0.0204 - acc: 0.9931 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 28/200
 - 1s - loss: 0.0198 - acc: 0.9930 - val_loss: 0.0166 - val_acc: 0.9946
Epoch 29/200
 - 1s - loss: 0.0198 - acc: 0.9938 - val_loss: 0.0166 - val_acc: 0.9946
Epoch 30/200
 - 1s - loss: 0.0191 - acc: 0.9937 - val_loss: 0.0165 - val_acc: 0.9946
Epoch 31/200
 - 1s - loss: 0.0186 - acc: 0.9944 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 32/200
 - 1s - loss: 0.0198 - acc: 0.9931 - val_loss: 0.0165 - val_acc: 0.9946
Epoch 33/200
 - 1s - loss: 0.0194 - acc: 0.9934 - val_loss: 0.0165 - val_acc: 0.9946
Epoch 34/200
 - 1s - loss: 0.0182 - acc: 0.9933 - val_loss: 0.0165 - val_acc: 0.9946
Epoch 35/200
 - 1s - loss: 0.0188 - acc: 0.9931 - val_loss: 0.0164 - val_acc: 0.9946
Epoch 36/200
 - 1s - loss: 0.0187 - acc: 0.9935 - val_loss: 0.0165 - val_acc: 0.9946
Epoch 37/200
 - 1s - loss: 0.0193 - acc: 0.9931 - val_loss: 0.0164 - val_acc: 0.9946
Epoch 38/200
 - 1s - loss: 0.0194 - acc: 0.9938 - val_loss: 0.0164 - val_acc: 0.9946
Epoch 39/200
 - 1s - loss: 0.0188 - acc: 0.9932 - val_loss: 0.0164 - val_acc: 0.9946
Epoch 40/200
 - 1s - loss: 0.0178 - acc: 0.9945 - val_loss: 0.0163 - val_acc: 0.9946
Epoch 41/200
 - 1s - loss: 0.0188 - acc: 0.9938 - val_loss: 0.0164 - val_acc: 0.9946
Epoch 42/200
 - 1s - loss: 0.0186 - acc: 0.9937 - val_loss: 0.0164 - val_acc: 0.9946
Epoch 43/200
 - 1s - loss: 0.0187 - acc: 0.9940 - val_loss: 0.0164 - val_acc: 0.9946
Epoch 44/200
 - 1s - loss: 0.0189 - acc: 0.9936 - val_loss: 0.0163 - val_acc: 0.9946
Epoch 45/200
 - 1s - loss: 0.0188 - acc: 0.9942 - val_loss: 0.0163 - val_acc: 0.9946
Epoch 46/200
 - 1s - loss: 0.0195 - acc: 0.9936 - val_loss: 0.0163 - val_acc: 0.9946
Epoch 47/200
 - 1s - loss: 0.0183 - acc: 0.9932 - val_loss: 0.0162 - val_acc: 0.9946
Epoch 48/200
 - 1s - loss: 0.0195 - acc: 0.9931 - val_loss: 0.0162 - val_acc: 0.9946
Epoch 49/200
 - 1s - loss: 0.0180 - acc: 0.9940 - val_loss: 0.0162 - val_acc: 0.9946
Epoch 50/200
 - 1s - loss: 0.0180 - acc: 0.9940 - val_loss: 0.0163 - val_acc: 0.9946
Epoch 51/200
 - 1s - loss: 0.0191 - acc: 0.9933 - val_loss: 0.0162 - val_acc: 0.9946
Epoch 52/200
 - 1s - loss: 0.0190 - acc: 0.9939 - val_loss: 0.0162 - val_acc: 0.9946
Epoch 53/200
 - 1s - loss: 0.0178 - acc: 0.9944 - val_loss: 0.0162 - val_acc: 0.9946
Epoch 54/200
 - 1s - loss: 0.0179 - acc: 0.9942 - val_loss: 0.0162 - val_acc: 0.9946
Epoch 55/200
 - 1s - loss: 0.0187 - acc: 0.9938 - val_loss: 0.0162 - val_acc: 0.9946
Epoch 56/200
 - 1s - loss: 0.0189 - acc: 0.9939 - val_loss: 0.0162 - val_acc: 0.9946
Epoch 57/200
 - 1s - loss: 0.0185 - acc: 0.9932 - val_loss: 0.0162 - val_acc: 0.9946
Epoch 58/200
 - 1s - loss: 0.0187 - acc: 0.9940 - val_loss: 0.0162 - val_acc: 0.9946
Epoch 59/200
 - 1s - loss: 0.0182 - acc: 0.9935 - val_loss: 0.0162 - val_acc: 0.9946
Epoch 60/200
 - 1s - loss: 0.0182 - acc: 0.9944 - val_loss: 0.0161 - val_acc: 0.9946
Epoch 61/200
 - 1s - loss: 0.0186 - acc: 0.9939 - val_loss: 0.0161 - val_acc: 0.9946
Epoch 62/200
 - 1s - loss: 0.0195 - acc: 0.9939 - val_loss: 0.0161 - val_acc: 0.9946
Epoch 63/200
 - 1s - loss: 0.0181 - acc: 0.9941 - val_loss: 0.0161 - val_acc: 0.9946
Epoch 64/200
 - 1s - loss: 0.0189 - acc: 0.9937 - val_loss: 0.0161 - val_acc: 0.9946
Epoch 65/200
 - 1s - loss: 0.0183 - acc: 0.9937 - val_loss: 0.0161 - val_acc: 0.9946
Epoch 66/200
 - 1s - loss: 0.0181 - acc: 0.9936 - val_loss: 0.0161 - val_acc: 0.9946
Epoch 67/200
 - 1s - loss: 0.0172 - acc: 0.9944 - val_loss: 0.0161 - val_acc: 0.9946
Epoch 68/200
 - 1s - loss: 0.0174 - acc: 0.9939 - val_loss: 0.0161 - val_acc: 0.9946
Epoch 69/200
 - 1s - loss: 0.0172 - acc: 0.9944 - val_loss: 0.0161 - val_acc: 0.9946
Epoch 70/200
 - 1s - loss: 0.0181 - acc: 0.9941 - val_loss: 0.0161 - val_acc: 0.9946
Epoch 71/200
 - 1s - loss: 0.0181 - acc: 0.9940 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 72/200
 - 1s - loss: 0.0173 - acc: 0.9946 - val_loss: 0.0161 - val_acc: 0.9946
Epoch 73/200
 - 1s - loss: 0.0178 - acc: 0.9940 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 74/200
 - 1s - loss: 0.0175 - acc: 0.9940 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 75/200
 - 1s - loss: 0.0179 - acc: 0.9938 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 76/200
 - 1s - loss: 0.0185 - acc: 0.9940 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 77/200
 - 1s - loss: 0.0167 - acc: 0.9945 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 78/200
 - 1s - loss: 0.0180 - acc: 0.9935 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 79/200
 - 1s - loss: 0.0186 - acc: 0.9937 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 80/200
 - 1s - loss: 0.0186 - acc: 0.9935 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 81/200
 - 1s - loss: 0.0184 - acc: 0.9942 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 82/200
 - 1s - loss: 0.0192 - acc: 0.9932 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 83/200
 - 1s - loss: 0.0179 - acc: 0.9940 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 84/200
 - 1s - loss: 0.0175 - acc: 0.9941 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 85/200
 - 1s - loss: 0.0190 - acc: 0.9940 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 86/200
 - 1s - loss: 0.0181 - acc: 0.9942 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 87/200
 - 1s - loss: 0.0178 - acc: 0.9937 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 88/200
 - 1s - loss: 0.0182 - acc: 0.9938 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 89/200
 - 1s - loss: 0.0171 - acc: 0.9943 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 90/200
 - 1s - loss: 0.0176 - acc: 0.9941 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 91/200
 - 1s - loss: 0.0183 - acc: 0.9931 - val_loss: 0.0160 - val_acc: 0.9946
Epoch 92/200
 - 1s - loss: 0.0185 - acc: 0.9937 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 93/200
 - 1s - loss: 0.0177 - acc: 0.9943 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 94/200
 - 1s - loss: 0.0190 - acc: 0.9936 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 95/200
 - 1s - loss: 0.0172 - acc: 0.9943 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 96/200
 - 1s - loss: 0.0176 - acc: 0.9942 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 97/200
 - 1s - loss: 0.0178 - acc: 0.9944 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 98/200
 - 1s - loss: 0.0172 - acc: 0.9938 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 99/200
 - 1s - loss: 0.0176 - acc: 0.9941 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 100/200
 - 1s - loss: 0.0163 - acc: 0.9942 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 101/200
 - 1s - loss: 0.0173 - acc: 0.9938 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 102/200
 - 1s - loss: 0.0176 - acc: 0.9941 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 103/200
 - 1s - loss: 0.0181 - acc: 0.9937 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 104/200
 - 1s - loss: 0.0177 - acc: 0.9940 - val_loss: 0.0159 - val_acc: 0.9948
Epoch 105/200
 - 1s - loss: 0.0174 - acc: 0.9942 - val_loss: 0.0159 - val_acc: 0.9948
Epoch 106/200
 - 1s - loss: 0.0187 - acc: 0.9942 - val_loss: 0.0159 - val_acc: 0.9948
Epoch 107/200
 - 1s - loss: 0.0166 - acc: 0.9944 - val_loss: 0.0159 - val_acc: 0.9948
Epoch 108/200
 - 1s - loss: 0.0180 - acc: 0.9939 - val_loss: 0.0159 - val_acc: 0.9948
Epoch 109/200
 - 1s - loss: 0.0187 - acc: 0.9938 - val_loss: 0.0159 - val_acc: 0.9948
Epoch 110/200
 - 1s - loss: 0.0179 - acc: 0.9936 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 111/200
 - 1s - loss: 0.0178 - acc: 0.9937 - val_loss: 0.0159 - val_acc: 0.9948
Epoch 112/200
 - 1s - loss: 0.0183 - acc: 0.9941 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 113/200
 - 1s - loss: 0.0183 - acc: 0.9939 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 114/200
 - 1s - loss: 0.0167 - acc: 0.9942 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 115/200
 - 1s - loss: 0.0181 - acc: 0.9937 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 116/200
 - 1s - loss: 0.0176 - acc: 0.9937 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 117/200
 - 1s - loss: 0.0166 - acc: 0.9946 - val_loss: 0.0159 - val_acc: 0.9946
Epoch 118/200
 - 1s - loss: 0.0178 - acc: 0.9937 - val_loss: 0.0158 - val_acc: 0.9946
Epoch 119/200
 - 1s - loss: 0.0176 - acc: 0.9939 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 120/200
 - 1s - loss: 0.0173 - acc: 0.9944 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 121/200
 - 1s - loss: 0.0163 - acc: 0.9951 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 122/200
 - 1s - loss: 0.0176 - acc: 0.9940 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 123/200
 - 1s - loss: 0.0182 - acc: 0.9936 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 124/200
 - 1s - loss: 0.0174 - acc: 0.9941 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 125/200
 - 1s - loss: 0.0172 - acc: 0.9941 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 126/200
 - 1s - loss: 0.0169 - acc: 0.9944 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 127/200
 - 1s - loss: 0.0173 - acc: 0.9942 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 128/200
 - 1s - loss: 0.0180 - acc: 0.9938 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 129/200
 - 1s - loss: 0.0176 - acc: 0.9941 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 130/200
 - 1s - loss: 0.0169 - acc: 0.9942 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 131/200
 - 1s - loss: 0.0180 - acc: 0.9934 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 132/200
 - 1s - loss: 0.0166 - acc: 0.9940 - val_loss: 0.0158 - val_acc: 0.9948
Epoch 133/200
 - 1s - loss: 0.0170 - acc: 0.9944 - val_loss: 0.0158 - val_acc: 0.9948
 ... (epochs 134–198 omitted; loss holds near 0.017, val_loss drifts from 0.0158 to 0.0157, val_acc steady at 0.9948) ...
Epoch 199/200
 - 1s - loss: 0.0177 - acc: 0.9940 - val_loss: 0.0156 - val_acc: 0.9948
Epoch 200/200
 - 1s - loss: 0.0164 - acc: 0.9946 - val_loss: 0.0156 - val_acc: 0.9948
2018-03-27 10:38:51,653 [INFO] Evaluate...
2018-03-27 10:38:54,630 [INFO] Done!
2018-03-27 10:38:54,636 [INFO] tpe_transform took 0.002485 seconds
2018-03-27 10:38:54,637 [INFO] TPE using 39/39 trials with best loss 0.011121
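The repeated `tpe_transform` / `TPE using N/N trials with best loss ...` messages come from a hyperopt-style Tree-structured Parzen Estimator search: each trial samples a hyperparameter set, trains the small dense classifier on the stacked bottleneck features, and reports its validation loss. Below is a minimal sketch of that kind of trial loop — with TPE replaced by plain random sampling and the Keras training call stubbed out by a toy loss surface. All names (`objective`, `sample_params`, `run_search`) and the search space are illustrative, not taken from the notebook's actual search script:

```python
import random

def objective(params):
    # Stand-in for "train the ensemble head, return val_loss".
    # A deterministic toy surface with its minimum at
    # lr=1e-3, dropout=0.5 (purely illustrative).
    return (abs(params["lr"] - 1e-3) * 10
            + abs(params["dropout"] - 0.5) * 0.1)

def sample_params(rng):
    return {
        "lr": 10 ** rng.uniform(-4, -2),   # log-uniform, like hp.loguniform
        "dropout": rng.uniform(0.2, 0.8),  # uniform, like hp.uniform
    }

def run_search(n_trials=40, seed=0):
    rng = random.Random(seed)
    best = None
    for i in range(n_trials):
        params = sample_params(rng)
        loss = objective(params)
        if best is None or loss < best[1]:
            best = (params, loss)
        # Mirrors the log line "TPE using i/i trials with best loss ..."
        print(f"trial {i + 1}/{n_trials}, best loss {best[1]:.6f}")
    return best

best_params, best_loss = run_search()
```

Unlike this random-sampling stand-in, TPE models the distribution of good vs. bad trials and proposes the next candidate accordingly, which is why the logged best loss (0.011121 here) stops improving once the search converges.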
2018-03-27 10:38:54,645 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:38:55,632 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 5s - loss: 0.0809 - acc: 0.9760 - val_loss: 0.0346 - val_acc: 0.9914
Epoch 2/200
 - 1s - loss: 0.0352 - acc: 0.9901 - val_loss: 0.0302 - val_acc: 0.9926
Epoch 3/200
 - 1s - loss: 0.0316 - acc: 0.9910 - val_loss: 0.0282 - val_acc: 0.9934
 ... (epochs 4–198 omitted; val_loss declines slowly from 0.0271 to 0.0205, val_acc 0.9944→0.9942) ...
Epoch 199/200
 - 1s - loss: 0.0207 - acc: 0.9940 - val_loss: 0.0205 - val_acc: 0.9942
Epoch 200/200
 - 1s - loss: 0.0207 - acc: 0.9940 - val_loss: 0.0205 - val_acc: 0.9942
2018-03-27 10:42:05,534 [INFO] Evaluate...
2018-03-27 10:42:08,591 [INFO] Done!
2018-03-27 10:42:08,599 [INFO] tpe_transform took 0.003273 seconds
2018-03-27 10:42:08,599 [INFO] TPE using 40/40 trials with best loss 0.011121
2018-03-27 10:42:08,607 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
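Each trial starts by reloading the bottleneck features previously exported from InceptionV3, Xception, and ResNet50 (`original_*.h5`), which are then concatenated into one design matrix for the dense classifier on top. A hedged sketch of that merge step, with the HDF5 read replaced by in-memory arrays for illustration — the 2048-wide shapes match these models' global-average-pooling outputs, but the exact layout of the notebook's `.h5` files is an assumption:

```python
import numpy as np

# Stand-ins for the arrays read from the three .h5 files
# ("original_InceptionV3.h5" etc.); 8 samples is illustrative only.
rng = np.random.default_rng(0)
feats = {
    "InceptionV3": rng.normal(size=(8, 2048)),
    "Xception":    rng.normal(size=(8, 2048)),
    "ResNet50":    rng.normal(size=(8, 2048)),
}

# Concatenate per-model bottleneck features along the feature axis,
# yielding one 6144-dimensional input vector per image.
X = np.concatenate(
    [feats[k] for k in ("InceptionV3", "Xception", "ResNet50")], axis=1)
print(X.shape)  # (8, 6144)
```

Caching features this way means each of the 40-plus search trials only retrains the tiny dense head (about 1 s per epoch in the logs), rather than re-running the three convolutional backbones.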
model created
['loss', 'acc']
2018-03-27 10:42:09,611 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 5s - loss: 0.0505 - acc: 0.9791 - val_loss: 0.0219 - val_acc: 0.9924
Epoch 2/200
 - 1s - loss: 0.0214 - acc: 0.9927 - val_loss: 0.0205 - val_acc: 0.9934
Epoch 3/200
 - 1s - loss: 0.0189 - acc: 0.9937 - val_loss: 0.0192 - val_acc: 0.9934
 ... (epochs 4–74 omitted; val_loss declines from 0.0186 to 0.0164, val_acc ~0.9940–0.9944) ...
Epoch 75/200
 - 1s - loss: 0.0117 - acc: 0.9965 - val_loss: 0.0164 - val_acc: 0.9944
Epoch 76/200
 - 1s - loss: 0.0120 - acc: 0.9961 - val_loss: 0.0164 - val_acc: 0.9944
2018-03-27 10:43:28,416 [INFO] Evaluate...
2018-03-27 10:43:31,485 [INFO] Done!
2018-03-27 10:43:31,492 [INFO] tpe_transform took 0.002641 seconds
2018-03-27 10:43:31,493 [INFO] TPE using 41/41 trials with best loss 0.011121
2018-03-27 10:43:31,501 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:43:32,489 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 5s - loss: 0.0534 - acc: 0.9781 - val_loss: 0.0221 - val_acc: 0.9926
Epoch 2/200
 - 1s - loss: 0.0263 - acc: 0.9909 - val_loss: 0.0209 - val_acc: 0.9928
Epoch 3/200
 - 1s - loss: 0.0237 - acc: 0.9919 - val_loss: 0.0202 - val_acc: 0.9938
 ... (epochs 4–58 omitted; val_loss declines from 0.0199 to 0.0176, val_acc ~0.9938–0.9940) ...
Epoch 59/200
 - 1s - loss: 0.0190 - acc: 0.9941 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 60/200
 - 1s - loss: 0.0177 - acc: 0.9942 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 61/200
 - 1s - loss: 0.0193 - acc: 0.9939 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 62/200
 - 1s - loss: 0.0184 - acc: 0.9940 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 63/200
 - 1s - loss: 0.0173 - acc: 0.9946 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 64/200
 - 1s - loss: 0.0187 - acc: 0.9940 - val_loss: 0.0175 - val_acc: 0.9940
Epoch 65/200
 - 1s - loss: 0.0187 - acc: 0.9936 - val_loss: 0.0175 - val_acc: 0.9940
Epoch 66/200
 - 1s - loss: 0.0185 - acc: 0.9941 - val_loss: 0.0175 - val_acc: 0.9940
Epoch 67/200
 - 1s - loss: 0.0186 - acc: 0.9937 - val_loss: 0.0175 - val_acc: 0.9940
Epoch 68/200
 - 1s - loss: 0.0185 - acc: 0.9932 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 69/200
 - 1s - loss: 0.0186 - acc: 0.9936 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 70/200
 - 1s - loss: 0.0187 - acc: 0.9936 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 71/200
 - 1s - loss: 0.0185 - acc: 0.9942 - val_loss: 0.0175 - val_acc: 0.9940
Epoch 72/200
 - 1s - loss: 0.0186 - acc: 0.9942 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 73/200
 - 1s - loss: 0.0183 - acc: 0.9940 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 74/200
 - 1s - loss: 0.0183 - acc: 0.9940 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 75/200
 - 1s - loss: 0.0187 - acc: 0.9944 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 76/200
 - 1s - loss: 0.0180 - acc: 0.9941 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 77/200
 - 1s - loss: 0.0185 - acc: 0.9940 - val_loss: 0.0174 - val_acc: 0.9938
Epoch 78/200
 - 1s - loss: 0.0190 - acc: 0.9938 - val_loss: 0.0174 - val_acc: 0.9938
Epoch 79/200
 - 1s - loss: 0.0188 - acc: 0.9941 - val_loss: 0.0174 - val_acc: 0.9938
Epoch 80/200
 - 1s - loss: 0.0185 - acc: 0.9942 - val_loss: 0.0174 - val_acc: 0.9940
Epoch 81/200
 - 1s - loss: 0.0202 - acc: 0.9934 - val_loss: 0.0174 - val_acc: 0.9940
Epoch 82/200
 - 1s - loss: 0.0181 - acc: 0.9943 - val_loss: 0.0174 - val_acc: 0.9940
Epoch 83/200
 - 1s - loss: 0.0197 - acc: 0.9938 - val_loss: 0.0174 - val_acc: 0.9940
Epoch 84/200
 - 1s - loss: 0.0177 - acc: 0.9941 - val_loss: 0.0174 - val_acc: 0.9940
Epoch 85/200
 - 1s - loss: 0.0183 - acc: 0.9942 - val_loss: 0.0174 - val_acc: 0.9940
Epoch 86/200
 - 1s - loss: 0.0185 - acc: 0.9941 - val_loss: 0.0174 - val_acc: 0.9940
Epoch 87/200
 - 1s - loss: 0.0191 - acc: 0.9932 - val_loss: 0.0174 - val_acc: 0.9940
Epoch 88/200
 - 1s - loss: 0.0187 - acc: 0.9941 - val_loss: 0.0174 - val_acc: 0.9940
Epoch 89/200
 - 1s - loss: 0.0188 - acc: 0.9935 - val_loss: 0.0174 - val_acc: 0.9940
Epoch 90/200
 - 1s - loss: 0.0185 - acc: 0.9937 - val_loss: 0.0174 - val_acc: 0.9940
Epoch 91/200
 - 1s - loss: 0.0182 - acc: 0.9944 - val_loss: 0.0174 - val_acc: 0.9940
Epoch 92/200
 - 1s - loss: 0.0182 - acc: 0.9937 - val_loss: 0.0174 - val_acc: 0.9940
Epoch 93/200
 - 1s - loss: 0.0183 - acc: 0.9942 - val_loss: 0.0174 - val_acc: 0.9940
Epoch 94/200
 - 1s - loss: 0.0189 - acc: 0.9940 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 95/200
 - 1s - loss: 0.0190 - acc: 0.9939 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 96/200
 - 1s - loss: 0.0189 - acc: 0.9940 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 97/200
 - 1s - loss: 0.0198 - acc: 0.9934 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 98/200
 - 1s - loss: 0.0180 - acc: 0.9949 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 99/200
 - 1s - loss: 0.0191 - acc: 0.9937 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 100/200
 - 1s - loss: 0.0184 - acc: 0.9936 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 101/200
 - 1s - loss: 0.0180 - acc: 0.9942 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 102/200
 - 1s - loss: 0.0193 - acc: 0.9936 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 103/200
 - 1s - loss: 0.0186 - acc: 0.9941 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 104/200
 - 1s - loss: 0.0188 - acc: 0.9938 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 105/200
 - 1s - loss: 0.0185 - acc: 0.9940 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 106/200
 - 1s - loss: 0.0186 - acc: 0.9939 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 107/200
 - 1s - loss: 0.0180 - acc: 0.9940 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 108/200
 - 1s - loss: 0.0178 - acc: 0.9942 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 109/200
 - 1s - loss: 0.0185 - acc: 0.9938 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 110/200
 - 1s - loss: 0.0180 - acc: 0.9949 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 111/200
 - 1s - loss: 0.0182 - acc: 0.9941 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 112/200
 - 1s - loss: 0.0186 - acc: 0.9941 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 113/200
 - 1s - loss: 0.0176 - acc: 0.9945 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 114/200
 - 1s - loss: 0.0178 - acc: 0.9940 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 115/200
 - 1s - loss: 0.0181 - acc: 0.9948 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 116/200
 - 1s - loss: 0.0180 - acc: 0.9939 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 117/200
 - 1s - loss: 0.0177 - acc: 0.9944 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 118/200
 - 1s - loss: 0.0172 - acc: 0.9947 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 119/200
 - 1s - loss: 0.0182 - acc: 0.9939 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 120/200
 - 1s - loss: 0.0188 - acc: 0.9939 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 121/200
 - 1s - loss: 0.0174 - acc: 0.9944 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 122/200
 - 1s - loss: 0.0185 - acc: 0.9940 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 123/200
 - 1s - loss: 0.0170 - acc: 0.9942 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 124/200
 - 1s - loss: 0.0178 - acc: 0.9945 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 125/200
 - 1s - loss: 0.0173 - acc: 0.9946 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 126/200
 - 1s - loss: 0.0181 - acc: 0.9940 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 127/200
 - 1s - loss: 0.0184 - acc: 0.9940 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 128/200
 - 1s - loss: 0.0173 - acc: 0.9945 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 129/200
 - 1s - loss: 0.0177 - acc: 0.9943 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 130/200
 - 1s - loss: 0.0177 - acc: 0.9938 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 131/200
 - 1s - loss: 0.0187 - acc: 0.9938 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 132/200
 - 1s - loss: 0.0179 - acc: 0.9947 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 133/200
 - 1s - loss: 0.0176 - acc: 0.9943 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 134/200
 - 1s - loss: 0.0183 - acc: 0.9938 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 135/200
 - 1s - loss: 0.0175 - acc: 0.9947 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 136/200
 - 1s - loss: 0.0182 - acc: 0.9941 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 137/200
 - 1s - loss: 0.0178 - acc: 0.9947 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 138/200
 - 1s - loss: 0.0178 - acc: 0.9940 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 139/200
 - 1s - loss: 0.0173 - acc: 0.9948 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 140/200
 - 1s - loss: 0.0183 - acc: 0.9944 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 141/200
 - 1s - loss: 0.0180 - acc: 0.9944 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 142/200
 - 1s - loss: 0.0189 - acc: 0.9942 - val_loss: 0.0171 - val_acc: 0.9942
Epoch 143/200
 - 1s - loss: 0.0183 - acc: 0.9947 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 144/200
 - 1s - loss: 0.0171 - acc: 0.9940 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 145/200
 - 1s - loss: 0.0179 - acc: 0.9942 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 146/200
 - 1s - loss: 0.0179 - acc: 0.9939 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 147/200
 - 1s - loss: 0.0181 - acc: 0.9946 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 148/200
 - 1s - loss: 0.0177 - acc: 0.9942 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 149/200
 - 1s - loss: 0.0186 - acc: 0.9942 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 150/200
 - 1s - loss: 0.0184 - acc: 0.9941 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 151/200
 - 1s - loss: 0.0172 - acc: 0.9943 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 152/200
 - 1s - loss: 0.0176 - acc: 0.9941 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 153/200
 - 1s - loss: 0.0178 - acc: 0.9945 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 154/200
 - 1s - loss: 0.0189 - acc: 0.9939 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 155/200
 - 1s - loss: 0.0176 - acc: 0.9940 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 156/200
 - 1s - loss: 0.0178 - acc: 0.9940 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 157/200
 - 1s - loss: 0.0190 - acc: 0.9944 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 158/200
 - 1s - loss: 0.0180 - acc: 0.9940 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 159/200
 - 1s - loss: 0.0189 - acc: 0.9929 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 160/200
 - 1s - loss: 0.0179 - acc: 0.9941 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 161/200
 - 1s - loss: 0.0184 - acc: 0.9937 - val_loss: 0.0171 - val_acc: 0.9942
Epoch 162/200
 - 1s - loss: 0.0176 - acc: 0.9940 - val_loss: 0.0171 - val_acc: 0.9942
Epoch 163/200
 - 1s - loss: 0.0176 - acc: 0.9946 - val_loss: 0.0171 - val_acc: 0.9942
Epoch 164/200
 - 1s - loss: 0.0187 - acc: 0.9940 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 165/200
 - 1s - loss: 0.0176 - acc: 0.9944 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 166/200
 - 1s - loss: 0.0180 - acc: 0.9937 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 167/200
 - 1s - loss: 0.0186 - acc: 0.9936 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 168/200
 - 1s - loss: 0.0177 - acc: 0.9940 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 169/200
 - 1s - loss: 0.0182 - acc: 0.9944 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 170/200
 - 1s - loss: 0.0178 - acc: 0.9942 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 171/200
 - 1s - loss: 0.0176 - acc: 0.9945 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 172/200
 - 1s - loss: 0.0182 - acc: 0.9941 - val_loss: 0.0170 - val_acc: 0.9942
Epoch 173/200
 - 1s - loss: 0.0177 - acc: 0.9944 - val_loss: 0.0170 - val_acc: 0.9942
Epoch 174/200
 - 1s - loss: 0.0166 - acc: 0.9949 - val_loss: 0.0170 - val_acc: 0.9942
Epoch 175/200
 - 1s - loss: 0.0174 - acc: 0.9945 - val_loss: 0.0170 - val_acc: 0.9942
Epoch 176/200
 - 1s - loss: 0.0182 - acc: 0.9941 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 177/200
 - 1s - loss: 0.0173 - acc: 0.9946 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 178/200
 - 1s - loss: 0.0178 - acc: 0.9946 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 179/200
 - 1s - loss: 0.0166 - acc: 0.9946 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 180/200
 - 1s - loss: 0.0178 - acc: 0.9944 - val_loss: 0.0170 - val_acc: 0.9942
Epoch 181/200
 - 1s - loss: 0.0164 - acc: 0.9948 - val_loss: 0.0170 - val_acc: 0.9942
Epoch 182/200
 - 1s - loss: 0.0174 - acc: 0.9947 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 183/200
 - 1s - loss: 0.0185 - acc: 0.9935 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 184/200
 - 1s - loss: 0.0180 - acc: 0.9940 - val_loss: 0.0170 - val_acc: 0.9942
Epoch 185/200
 - 1s - loss: 0.0179 - acc: 0.9938 - val_loss: 0.0170 - val_acc: 0.9942
Epoch 186/200
 - 1s - loss: 0.0176 - acc: 0.9941 - val_loss: 0.0170 - val_acc: 0.9942
Epoch 187/200
 - 1s - loss: 0.0175 - acc: 0.9947 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 188/200
 - 1s - loss: 0.0173 - acc: 0.9946 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 189/200
 - 1s - loss: 0.0182 - acc: 0.9944 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 190/200
 - 1s - loss: 0.0184 - acc: 0.9940 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 191/200
 - 1s - loss: 0.0174 - acc: 0.9937 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 192/200
 - 1s - loss: 0.0170 - acc: 0.9944 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 193/200
 - 1s - loss: 0.0171 - acc: 0.9946 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 194/200
 - 1s - loss: 0.0177 - acc: 0.9942 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 195/200
 - 1s - loss: 0.0185 - acc: 0.9939 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 196/200
 - 1s - loss: 0.0183 - acc: 0.9940 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 197/200
 - 1s - loss: 0.0171 - acc: 0.9938 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 198/200
 - 1s - loss: 0.0175 - acc: 0.9942 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 199/200
 - 1s - loss: 0.0174 - acc: 0.9942 - val_loss: 0.0170 - val_acc: 0.9944
Epoch 200/200
 - 1s - loss: 0.0175 - acc: 0.9945 - val_loss: 0.0170 - val_acc: 0.9944
2018-03-27 10:46:42,900 [INFO] Evaluate...
2018-03-27 10:46:46,008 [INFO] Done!
2018-03-27 10:46:46,014 [INFO] tpe_transform took 0.002471 seconds
2018-03-27 10:46:46,015 [INFO] TPE using 42/42 trials with best loss 0.011121
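The `TPE using 42/42 trials with best loss 0.011121` lines come from a hyperparameter search in the hyperopt style: each trial proposes a parameter set, trains the ensemble classifier, and the search keeps the best validation loss seen so far. Below is a minimal pure-Python stand-in for that trial loop (not the notebook's actual code; the real run presumably uses hyperopt's `fmin` with `tpe.suggest`, and the `objective` here is a synthetic placeholder for training the classifier):

```python
import random

def objective(params):
    # Hypothetical stand-in for "train the ensemble classifier and
    # return its validation loss" — not the notebook's real objective.
    lr, dropout = params["lr"], params["dropout"]
    return (lr - 1e-3) ** 2 * 1e4 + (dropout - 0.5) ** 2

def run_trials(n_trials, seed=0):
    rng = random.Random(seed)
    best = {"loss": float("inf"), "params": None}
    for t in range(1, n_trials + 1):
        # Each trial samples a candidate hyperparameter set
        # (log-uniform learning rate, uniform dropout).
        params = {"lr": 10 ** rng.uniform(-5, -1),
                  "dropout": rng.uniform(0.0, 0.9)}
        loss = objective(params)
        if loss < best["loss"]:
            best = {"loss": loss, "params": params}
        print(f"search using {t}/{t} trials with best loss {best['loss']:.6f}")
    return best

best = run_trials(5)
```

TPE differs from this random sampler in how candidates are proposed (it models good vs. bad trials and samples where improvement is likely), but the bookkeeping of trials and best loss is the same.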
2018-03-27 10:46:46,023 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
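The `load features from 「original_*.h5」` lines indicate that pre-computed bottleneck features from InceptionV3, Xception, and ResNet50 are merged into one input for the small dense classifier. A sketch of that merge with synthetic arrays (the real helper presumably reads the arrays from the listed `.h5` files, e.g. via `h5py`; the 2048-wide shapes here are illustrative assumptions, not taken from the notebook):

```python
import numpy as np

# Illustrative per-backbone feature widths (assumed): each model's
# global-average-pooled bottleneck vector for every sample.
n_samples = 8
features = {
    "original_InceptionV3.h5": np.random.rand(n_samples, 2048),
    "original_Xception.h5":    np.random.rand(n_samples, 2048),
    "original_ResNet50.h5":    np.random.rand(n_samples, 2048),
}

# Concatenate the per-sample feature vectors side by side, so the
# classifier sees all three backbones' features at once.
X = np.concatenate([features[k] for k in features], axis=1)
print(X.shape)  # (8, 6144)
```

With features cached this way, each 200-epoch run only trains the small head on fixed vectors, which is why every epoch above takes about a second.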
2018-03-27 10:46:47,011 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 5s - loss: 0.0532 - acc: 0.9786 - val_loss: 0.0216 - val_acc: 0.9940
Epoch 2/200
 - 1s - loss: 0.0267 - acc: 0.9915 - val_loss: 0.0199 - val_acc: 0.9942
Epoch 3/200
 - 1s - loss: 0.0252 - acc: 0.9919 - val_loss: 0.0193 - val_acc: 0.9946
Epoch 4/200
 - 1s - loss: 0.0245 - acc: 0.9922 - val_loss: 0.0189 - val_acc: 0.9948
[... epochs 5-197 omitted: val_loss declines slowly from 0.0186 to 0.0160; val_acc holds between 0.9946 and 0.9954 ...]
Epoch 198/200
 - 1s - loss: 0.0198 - acc: 0.9937 - val_loss: 0.0160 - val_acc: 0.9952
Epoch 199/200
 - 1s - loss: 0.0203 - acc: 0.9938 - val_loss: 0.0160 - val_acc: 0.9952
Epoch 200/200
 - 1s - loss: 0.0199 - acc: 0.9940 - val_loss: 0.0160 - val_acc: 0.9952
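In both runs above, `val_loss` flattens long before epoch 200 (it barely moves over the last hundred epochs), so most of the training time is wasted. A small helper, sketched here independently of Keras, that detects such a plateau the way `EarlyStopping(patience=...)` would:

```python
def epochs_until_plateau(val_losses, patience=10, min_delta=1e-4):
    """Return the 1-based epoch at which training would stop if no
    improvement larger than min_delta is seen for `patience` epochs."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if best - loss > min_delta:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return len(val_losses)

# A val_loss curve shaped like the logs above: quick drop, long flat tail.
curve = [0.0216, 0.0199, 0.0193, 0.0190] + [0.0190] * 60
print(epochs_until_plateau(curve))  # → 14
```

In Keras the same effect comes from passing `EarlyStopping(monitor='val_loss', patience=10)` in the `callbacks` list of `fit`, which would cut these runs from 200 epochs to a few dozen without changing the final metrics.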
2018-03-27 10:49:57,578 [INFO] Evaluate...
2018-03-27 10:50:00,731 [INFO] Done!
2018-03-27 10:50:00,738 [INFO] tpe_transform took 0.003325 seconds
2018-03-27 10:50:00,739 [INFO] TPE using 43/43 trials with best loss 0.011121
2018-03-27 10:50:00,747 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:50:01,761 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
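The three "load features from …" lines above show that this classifier never sees raw images: it trains on precomputed bottleneck features from InceptionV3, Xception, and ResNet50, concatenated per image. A minimal numpy sketch of that concatenation step (the 2048-dim shapes and random arrays are illustrative stand-ins; the real features come from the original_*.h5 files via h5py):

```python
import numpy as np

# Stand-ins for the bottleneck features exported by the three
# pretrained models; real code would read them from original_*.h5.
n_samples = 4
feats = {
    "InceptionV3": np.random.rand(n_samples, 2048),
    "Xception":    np.random.rand(n_samples, 2048),
    "ResNet50":    np.random.rand(n_samples, 2048),
}

# Concatenate along the feature axis so each image becomes one long
# vector that the small dense classifier is trained on.
X = np.concatenate(
    [feats[k] for k in ("InceptionV3", "Xception", "ResNet50")], axis=1)
print(X.shape)  # (4, 6144)
```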
Epoch 1/200
 - 5s - loss: 0.1379 - acc: 0.9644 - val_loss: 0.0667 - val_acc: 0.9876
Epoch 2/200
 - 1s - loss: 0.0664 - acc: 0.9845 - val_loss: 0.0525 - val_acc: 0.9888
Epoch 3/200
 - 1s - loss: 0.0563 - acc: 0.9870 - val_loss: 0.0470 - val_acc: 0.9896
... (epochs 4-199 omitted: val_loss decreases smoothly from 0.0470 to 0.0266, val_acc plateaus around 0.9920-0.9928)
Epoch 200/200
 - 1s - loss: 0.0320 - acc: 0.9911 - val_loss: 0.0266 - val_acc: 0.9928
2018-03-27 10:53:13,597 [INFO] Evaluate...
2018-03-27 10:53:16,817 [INFO] Done!
2018-03-27 10:53:16,823 [INFO] tpe_transform took 0.002499 seconds
2018-03-27 10:53:16,824 [INFO] TPE using 44/44 trials with best loss 0.011121
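In each of these runs val_loss barely moves over the last hundred epochs, so a patience-based early-stopping rule (what Keras's EarlyStopping callback implements) would save most of the 200 epochs per trial. A standalone sketch of that rule, with parameter names borrowed from Keras but otherwise a hypothetical illustration:

```python
def stop_epoch(val_losses, patience=10, min_delta=1e-4):
    """Return the 1-based epoch at which patience-based early stopping
    would halt training, or None if it runs to the end."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best - min_delta:   # meaningful improvement: reset patience
            best = loss
            wait = 0
        else:                         # stalled epoch: burn one unit of patience
            wait += 1
            if wait >= patience:
                return epoch
    return None

# A curve that improves early and then flattens, like the logs above.
losses = [0.066, 0.052, 0.047, 0.044] + [0.0266] * 30
print(stop_epoch(losses))  # 15
```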
2018-03-27 10:53:16,832 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:53:17,819 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 5s - loss: 0.0521 - acc: 0.9806 - val_loss: 0.0291 - val_acc: 0.9916
Epoch 2/200
 - 1s - loss: 0.0238 - acc: 0.9924 - val_loss: 0.0273 - val_acc: 0.9912
Epoch 3/200
 - 1s - loss: 0.0214 - acc: 0.9932 - val_loss: 0.0250 - val_acc: 0.9926
Epoch 4/200
 - 1s - loss: 0.0201 - acc: 0.9939 - val_loss: 0.0245 - val_acc: 0.9924
Epoch 5/200
 - 1s - loss: 0.0188 - acc: 0.9942 - val_loss: 0.0240 - val_acc: 0.9928
Epoch 6/200
 - 1s - loss: 0.0186 - acc: 0.9944 - val_loss: 0.0236 - val_acc: 0.9928
Epoch 7/200
 - 1s - loss: 0.0183 - acc: 0.9946 - val_loss: 0.0233 - val_acc: 0.9928
Epoch 8/200
 - 1s - loss: 0.0178 - acc: 0.9946 - val_loss: 0.0232 - val_acc: 0.9928
Epoch 9/200
 - 1s - loss: 0.0179 - acc: 0.9941 - val_loss: 0.0229 - val_acc: 0.9928
Epoch 10/200
 - 1s - loss: 0.0173 - acc: 0.9949 - val_loss: 0.0227 - val_acc: 0.9928
Epoch 11/200
 - 1s - loss: 0.0174 - acc: 0.9948 - val_loss: 0.0228 - val_acc: 0.9928
Epoch 12/200
 - 1s - loss: 0.0169 - acc: 0.9950 - val_loss: 0.0226 - val_acc: 0.9928
Epoch 13/200
 - 1s - loss: 0.0165 - acc: 0.9948 - val_loss: 0.0225 - val_acc: 0.9928
Epoch 14/200
 - 1s - loss: 0.0167 - acc: 0.9947 - val_loss: 0.0224 - val_acc: 0.9926
Epoch 15/200
 - 1s - loss: 0.0173 - acc: 0.9946 - val_loss: 0.0225 - val_acc: 0.9928
Epoch 16/200
 - 1s - loss: 0.0164 - acc: 0.9949 - val_loss: 0.0223 - val_acc: 0.9926
Epoch 17/200
 - 1s - loss: 0.0166 - acc: 0.9949 - val_loss: 0.0223 - val_acc: 0.9926
Epoch 18/200
 - 1s - loss: 0.0160 - acc: 0.9953 - val_loss: 0.0222 - val_acc: 0.9926
Epoch 19/200
 - 1s - loss: 0.0159 - acc: 0.9951 - val_loss: 0.0220 - val_acc: 0.9926
Epoch 20/200
 - 1s - loss: 0.0158 - acc: 0.9950 - val_loss: 0.0220 - val_acc: 0.9926
Epoch 21/200
 - 1s - loss: 0.0158 - acc: 0.9950 - val_loss: 0.0219 - val_acc: 0.9926
Epoch 22/200
 - 1s - loss: 0.0156 - acc: 0.9949 - val_loss: 0.0219 - val_acc: 0.9926
Epoch 23/200
 - 1s - loss: 0.0160 - acc: 0.9950 - val_loss: 0.0219 - val_acc: 0.9928
Epoch 24/200
 - 1s - loss: 0.0158 - acc: 0.9951 - val_loss: 0.0219 - val_acc: 0.9928
Epoch 25/200
 - 1s - loss: 0.0155 - acc: 0.9950 - val_loss: 0.0218 - val_acc: 0.9928
Epoch 26/200
 - 1s - loss: 0.0156 - acc: 0.9953 - val_loss: 0.0218 - val_acc: 0.9928
Epoch 27/200
 - 1s - loss: 0.0155 - acc: 0.9956 - val_loss: 0.0217 - val_acc: 0.9928
Epoch 28/200
 - 1s - loss: 0.0151 - acc: 0.9955 - val_loss: 0.0217 - val_acc: 0.9928
Epoch 29/200
 - 1s - loss: 0.0159 - acc: 0.9957 - val_loss: 0.0217 - val_acc: 0.9928
Epoch 30/200
 - 1s - loss: 0.0151 - acc: 0.9954 - val_loss: 0.0216 - val_acc: 0.9928
Epoch 31/200
 - 1s - loss: 0.0157 - acc: 0.9954 - val_loss: 0.0217 - val_acc: 0.9928
Epoch 32/200
 - 1s - loss: 0.0152 - acc: 0.9955 - val_loss: 0.0217 - val_acc: 0.9928
Epoch 33/200
 - 1s - loss: 0.0148 - acc: 0.9956 - val_loss: 0.0215 - val_acc: 0.9928
Epoch 34/200
 - 1s - loss: 0.0149 - acc: 0.9953 - val_loss: 0.0215 - val_acc: 0.9928
Epoch 35/200
 - 1s - loss: 0.0149 - acc: 0.9954 - val_loss: 0.0215 - val_acc: 0.9928
Epoch 36/200
 - 1s - loss: 0.0150 - acc: 0.9954 - val_loss: 0.0215 - val_acc: 0.9928
Epoch 37/200
 - 1s - loss: 0.0148 - acc: 0.9959 - val_loss: 0.0215 - val_acc: 0.9928
Epoch 38/200
 - 1s - loss: 0.0151 - acc: 0.9954 - val_loss: 0.0215 - val_acc: 0.9928
Epoch 39/200
 - 1s - loss: 0.0153 - acc: 0.9953 - val_loss: 0.0215 - val_acc: 0.9928
Epoch 40/200
 - 1s - loss: 0.0148 - acc: 0.9959 - val_loss: 0.0215 - val_acc: 0.9928
Epoch 41/200
 - 1s - loss: 0.0150 - acc: 0.9955 - val_loss: 0.0214 - val_acc: 0.9928
Epoch 42/200
 - 1s - loss: 0.0144 - acc: 0.9956 - val_loss: 0.0214 - val_acc: 0.9928
Epoch 43/200
 - 1s - loss: 0.0151 - acc: 0.9955 - val_loss: 0.0214 - val_acc: 0.9928
Epoch 44/200
 - 1s - loss: 0.0150 - acc: 0.9955 - val_loss: 0.0214 - val_acc: 0.9928
Epoch 45/200
 - 1s - loss: 0.0146 - acc: 0.9957 - val_loss: 0.0213 - val_acc: 0.9928
Epoch 46/200
 - 1s - loss: 0.0152 - acc: 0.9956 - val_loss: 0.0213 - val_acc: 0.9928
Epoch 47/200
 - 1s - loss: 0.0145 - acc: 0.9954 - val_loss: 0.0213 - val_acc: 0.9928
Epoch 48/200
 - 1s - loss: 0.0145 - acc: 0.9959 - val_loss: 0.0213 - val_acc: 0.9928
Epoch 49/200
 - 1s - loss: 0.0152 - acc: 0.9950 - val_loss: 0.0213 - val_acc: 0.9928
Epoch 50/200
 - 1s - loss: 0.0148 - acc: 0.9955 - val_loss: 0.0213 - val_acc: 0.9928
Epoch 51/200
 - 1s - loss: 0.0148 - acc: 0.9955 - val_loss: 0.0213 - val_acc: 0.9928
Epoch 52/200
 - 1s - loss: 0.0143 - acc: 0.9958 - val_loss: 0.0212 - val_acc: 0.9928
Epoch 53/200
 - 1s - loss: 0.0144 - acc: 0.9956 - val_loss: 0.0213 - val_acc: 0.9928
Epoch 54/200
 - 1s - loss: 0.0144 - acc: 0.9956 - val_loss: 0.0212 - val_acc: 0.9928
Epoch 55/200
 - 1s - loss: 0.0148 - acc: 0.9950 - val_loss: 0.0212 - val_acc: 0.9932
Epoch 56/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0212 - val_acc: 0.9928
Epoch 57/200
 - 1s - loss: 0.0148 - acc: 0.9954 - val_loss: 0.0212 - val_acc: 0.9932
Epoch 58/200
 - 1s - loss: 0.0148 - acc: 0.9950 - val_loss: 0.0211 - val_acc: 0.9932
Epoch 59/200
 - 1s - loss: 0.0141 - acc: 0.9956 - val_loss: 0.0211 - val_acc: 0.9932
Epoch 60/200
 - 1s - loss: 0.0146 - acc: 0.9955 - val_loss: 0.0211 - val_acc: 0.9932
Epoch 61/200
 - 1s - loss: 0.0145 - acc: 0.9955 - val_loss: 0.0211 - val_acc: 0.9932
Epoch 62/200
 - 1s - loss: 0.0142 - acc: 0.9960 - val_loss: 0.0211 - val_acc: 0.9932
Epoch 63/200
 - 1s - loss: 0.0141 - acc: 0.9960 - val_loss: 0.0211 - val_acc: 0.9932
Epoch 64/200
 - 1s - loss: 0.0145 - acc: 0.9954 - val_loss: 0.0211 - val_acc: 0.9932
Epoch 65/200
 - 1s - loss: 0.0145 - acc: 0.9953 - val_loss: 0.0211 - val_acc: 0.9932
Epoch 66/200
 - 1s - loss: 0.0139 - acc: 0.9959 - val_loss: 0.0211 - val_acc: 0.9932
Epoch 67/200
 - 1s - loss: 0.0145 - acc: 0.9961 - val_loss: 0.0211 - val_acc: 0.9932
Epoch 68/200
 - 1s - loss: 0.0143 - acc: 0.9953 - val_loss: 0.0211 - val_acc: 0.9932
Epoch 69/200
 - 1s - loss: 0.0142 - acc: 0.9958 - val_loss: 0.0210 - val_acc: 0.9932
Epoch 70/200
 - 1s - loss: 0.0142 - acc: 0.9959 - val_loss: 0.0210 - val_acc: 0.9932
Epoch 71/200
 - 1s - loss: 0.0142 - acc: 0.9959 - val_loss: 0.0210 - val_acc: 0.9932
Epoch 72/200
 - 1s - loss: 0.0143 - acc: 0.9955 - val_loss: 0.0210 - val_acc: 0.9932
Epoch 73/200
 - 1s - loss: 0.0146 - acc: 0.9955 - val_loss: 0.0210 - val_acc: 0.9932
Epoch 74/200
 - 1s - loss: 0.0146 - acc: 0.9955 - val_loss: 0.0210 - val_acc: 0.9932
Epoch 75/200
 - 1s - loss: 0.0144 - acc: 0.9956 - val_loss: 0.0210 - val_acc: 0.9932
Epoch 76/200
 - 1s - loss: 0.0138 - acc: 0.9959 - val_loss: 0.0210 - val_acc: 0.9932
Epoch 77/200
 - 1s - loss: 0.0146 - acc: 0.9953 - val_loss: 0.0210 - val_acc: 0.9932
Epoch 78/200
 - 1s - loss: 0.0143 - acc: 0.9952 - val_loss: 0.0210 - val_acc: 0.9932
Epoch 79/200
 - 1s - loss: 0.0138 - acc: 0.9956 - val_loss: 0.0210 - val_acc: 0.9932
Epoch 80/200
 - 1s - loss: 0.0139 - acc: 0.9960 - val_loss: 0.0210 - val_acc: 0.9932
Epoch 81/200
 - 1s - loss: 0.0144 - acc: 0.9956 - val_loss: 0.0210 - val_acc: 0.9932
Epoch 82/200
 - 1s - loss: 0.0133 - acc: 0.9959 - val_loss: 0.0209 - val_acc: 0.9932
Epoch 83/200
 - 1s - loss: 0.0144 - acc: 0.9957 - val_loss: 0.0209 - val_acc: 0.9932
Epoch 84/200
 - 1s - loss: 0.0139 - acc: 0.9961 - val_loss: 0.0209 - val_acc: 0.9932
Epoch 85/200
 - 1s - loss: 0.0136 - acc: 0.9963 - val_loss: 0.0209 - val_acc: 0.9932
Epoch 86/200
 - 1s - loss: 0.0143 - acc: 0.9959 - val_loss: 0.0210 - val_acc: 0.9932
Epoch 87/200
 - 1s - loss: 0.0140 - acc: 0.9960 - val_loss: 0.0209 - val_acc: 0.9932
Epoch 88/200
 - 1s - loss: 0.0140 - acc: 0.9958 - val_loss: 0.0209 - val_acc: 0.9932
Epoch 89/200
 - 1s - loss: 0.0142 - acc: 0.9956 - val_loss: 0.0209 - val_acc: 0.9932
Epoch 90/200
 - 1s - loss: 0.0139 - acc: 0.9958 - val_loss: 0.0209 - val_acc: 0.9932
Epoch 91/200
 - 1s - loss: 0.0141 - acc: 0.9954 - val_loss: 0.0209 - val_acc: 0.9932
Epoch 92/200
 - 1s - loss: 0.0138 - acc: 0.9954 - val_loss: 0.0209 - val_acc: 0.9932
Epoch 93/200
 - 1s - loss: 0.0140 - acc: 0.9958 - val_loss: 0.0209 - val_acc: 0.9932
Epoch 94/200
 - 1s - loss: 0.0139 - acc: 0.9959 - val_loss: 0.0209 - val_acc: 0.9932
Epoch 95/200
 - 1s - loss: 0.0139 - acc: 0.9959 - val_loss: 0.0209 - val_acc: 0.9932
Epoch 96/200
 - 1s - loss: 0.0138 - acc: 0.9961 - val_loss: 0.0209 - val_acc: 0.9932
Epoch 97/200
 - 1s - loss: 0.0143 - acc: 0.9954 - val_loss: 0.0209 - val_acc: 0.9932
Epoch 98/200
 - 1s - loss: 0.0140 - acc: 0.9960 - val_loss: 0.0209 - val_acc: 0.9932
Epoch 99/200
 - 1s - loss: 0.0137 - acc: 0.9959 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 100/200
 - 1s - loss: 0.0136 - acc: 0.9960 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 101/200
 - 1s - loss: 0.0140 - acc: 0.9959 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 102/200
 - 1s - loss: 0.0135 - acc: 0.9963 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 103/200
 - 1s - loss: 0.0142 - acc: 0.9959 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 104/200
 - 1s - loss: 0.0139 - acc: 0.9959 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 105/200
 - 1s - loss: 0.0137 - acc: 0.9958 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 106/200
 - 1s - loss: 0.0139 - acc: 0.9955 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 107/200
 - 1s - loss: 0.0139 - acc: 0.9959 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 108/200
 - 1s - loss: 0.0135 - acc: 0.9959 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 109/200
 - 1s - loss: 0.0141 - acc: 0.9958 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 110/200
 - 1s - loss: 0.0139 - acc: 0.9956 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 111/200
 - 1s - loss: 0.0137 - acc: 0.9958 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 112/200
 - 1s - loss: 0.0135 - acc: 0.9960 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 113/200
 - 1s - loss: 0.0139 - acc: 0.9958 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 114/200
 - 1s - loss: 0.0137 - acc: 0.9960 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 115/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 116/200
 - 1s - loss: 0.0138 - acc: 0.9960 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 117/200
 - 1s - loss: 0.0134 - acc: 0.9961 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 118/200
 - 1s - loss: 0.0136 - acc: 0.9956 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 119/200
 - 1s - loss: 0.0138 - acc: 0.9957 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 120/200
 - 1s - loss: 0.0136 - acc: 0.9961 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 121/200
 - 1s - loss: 0.0136 - acc: 0.9962 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 122/200
 - 1s - loss: 0.0139 - acc: 0.9955 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 123/200
 - 1s - loss: 0.0138 - acc: 0.9957 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 124/200
 - 1s - loss: 0.0138 - acc: 0.9959 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 125/200
 - 1s - loss: 0.0137 - acc: 0.9958 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 126/200
 - 1s - loss: 0.0140 - acc: 0.9960 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 127/200
 - 1s - loss: 0.0135 - acc: 0.9957 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 128/200
 - 1s - loss: 0.0134 - acc: 0.9956 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 129/200
 - 1s - loss: 0.0133 - acc: 0.9959 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 130/200
 - 1s - loss: 0.0136 - acc: 0.9961 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 131/200
 - 1s - loss: 0.0135 - acc: 0.9961 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 132/200
 - 1s - loss: 0.0137 - acc: 0.9963 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 133/200
 - 1s - loss: 0.0139 - acc: 0.9958 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 134/200
 - 1s - loss: 0.0139 - acc: 0.9957 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 135/200
 - 1s - loss: 0.0133 - acc: 0.9958 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 136/200
 - 1s - loss: 0.0135 - acc: 0.9957 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 137/200
 - 1s - loss: 0.0136 - acc: 0.9958 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 138/200
 - 1s - loss: 0.0137 - acc: 0.9960 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 139/200
 - 1s - loss: 0.0133 - acc: 0.9962 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 140/200
 - 1s - loss: 0.0133 - acc: 0.9958 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 141/200
 - 1s - loss: 0.0138 - acc: 0.9955 - val_loss: 0.0207 - val_acc: 0.9932
Epoch 142/200
 - 1s - loss: 0.0135 - acc: 0.9961 - val_loss: 0.0206 - val_acc: 0.9932
Epoch 143/200
 - 1s - loss: 0.0133 - acc: 0.9960 - val_loss: 0.0206 - val_acc: 0.9932
Epoch 144/200
 - 1s - loss: 0.0131 - acc: 0.9963 - val_loss: 0.0206 - val_acc: 0.9932
Epoch 145/200
 - 1s - loss: 0.0136 - acc: 0.9958 - val_loss: 0.0206 - val_acc: 0.9932
Epoch 146/200
 - 1s - loss: 0.0136 - acc: 0.9958 - val_loss: 0.0206 - val_acc: 0.9932
... (epochs 147–198 trimmed: loss plateaus at ≈0.013, val_loss 0.0206→0.0205, val_acc 0.9932–0.9934) ...
Epoch 199/200
 - 1s - loss: 0.0133 - acc: 0.9962 - val_loss: 0.0205 - val_acc: 0.9934
Epoch 200/200
 - 1s - loss: 0.0129 - acc: 0.9962 - val_loss: 0.0205 - val_acc: 0.9934
2018-03-27 10:56:28,206 [INFO] Evaluate...
2018-03-27 10:56:31,484 [INFO] Done!
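The trial just completed above is one iteration of the search loop: per trial, cached bottleneck features from several ImageNet models (the log names `original_InceptionV3.h5`, `original_Xception.h5`, `original_ResNet50.h5`) are concatenated and a small classifier head is fit on them, and the search keeps the hyperparameters with the best validation loss. A minimal self-contained sketch of that idea, with synthetic features standing in for the cached .h5 files and plain grid search standing in for hyperopt's TPE sampler:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the cached bottleneck features; the real notebook
# loads them from files such as original_InceptionV3.h5 (names from the log).
def make_fake_features(n, dims):
    y = rng.integers(0, 2, size=n).astype(float)
    parts = [rng.normal(size=(n, d)) + y[:, None] * 0.5 for d in dims]
    return np.concatenate(parts, axis=1), y  # merged features, labels

def fit_logistic(X, y, lr, epochs=50):
    """Tiny logistic head trained with full-batch gradient descent;
    returns the final training log-loss (the quantity the search minimizes)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        grad = p - y
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    eps = 1e-7
    return float(-np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps)))

X, y = make_fake_features(500, [8, 8, 8])   # three models' features, concatenated
# Plain grid search over the learning rate stands in for hyperopt's TPE sampler.
losses = {lr: fit_logistic(X, y, lr) for lr in (0.01, 0.1, 0.5)}
best_lr = min(losses, key=losses.get)
print(best_lr, round(losses[best_lr], 4))
```

This is a sketch under stated assumptions, not the notebook's actual trial code; the real loop fits a Keras model on the merged features and lets TPE propose each trial's hyperparameters, as the `tpe_transform` / `TPE using N/N trials` lines above record.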
2018-03-27 10:56:31,491 [INFO] tpe_transform took 0.002448 seconds
2018-03-27 10:56:31,491 [INFO] TPE using 45/45 trials with best loss 0.011121
2018-03-27 10:56:31,499 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:56:32,487 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 5s - loss: 0.0760 - acc: 0.9701 - val_loss: 0.0324 - val_acc: 0.9900
Epoch 2/200
 - 1s - loss: 0.0332 - acc: 0.9894 - val_loss: 0.0295 - val_acc: 0.9912
Epoch 3/200
 - 1s - loss: 0.0317 - acc: 0.9898 - val_loss: 0.0281 - val_acc: 0.9918
Epoch 4/200
 - 1s - loss: 0.0297 - acc: 0.9904 - val_loss: 0.0274 - val_acc: 0.9920
Epoch 5/200
 - 1s - loss: 0.0285 - acc: 0.9913 - val_loss: 0.0268 - val_acc: 0.9926
Epoch 6/200
 - 1s - loss: 0.0280 - acc: 0.9915 - val_loss: 0.0264 - val_acc: 0.9926
Epoch 7/200
 - 1s - loss: 0.0276 - acc: 0.9917 - val_loss: 0.0260 - val_acc: 0.9928
Epoch 8/200
 - 1s - loss: 0.0267 - acc: 0.9917 - val_loss: 0.0258 - val_acc: 0.9930
Epoch 9/200
 - 1s - loss: 0.0265 - acc: 0.9922 - val_loss: 0.0257 - val_acc: 0.9926
Epoch 10/200
 - 1s - loss: 0.0265 - acc: 0.9915 - val_loss: 0.0255 - val_acc: 0.9926
... (epochs 11–184 trimmed: loss 0.0265→0.0217, val_loss 0.0253→0.0225, val_acc settles at 0.9928–0.9930) ...
Epoch 185/200
 - 1s - loss: 0.0217 - acc: 0.9932 - val_loss: 0.0225 - val_acc: 0.9930
Epoch 186/200
 - 1s - loss: 0.0221 - acc: 0.9927 - val_loss: 0.0225 - val_acc: 0.9930
Epoch 187/200
 - 1s - loss: 0.0224 - acc: 0.9926 - val_loss: 0.0225 - val_acc: 0.9930
Epoch 188/200
 - 1s - loss: 0.0219 - acc: 0.9936 - val_loss: 0.0225 - val_acc: 0.9930
2018-03-27 10:59:33,293 [INFO] Evaluate...
2018-03-27 10:59:36,576 [INFO] Done!
2018-03-27 10:59:36,583 [INFO] tpe_transform took 0.003281 seconds
2018-03-27 10:59:36,584 [INFO] TPE using 46/46 trials with best loss 0.011121
2018-03-27 10:59:36,591 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 10:59:37,583 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 5s - loss: 0.0816 - acc: 0.9788 - val_loss: 0.0443 - val_acc: 0.9906
Epoch 2/200
 - 1s - loss: 0.0427 - acc: 0.9905 - val_loss: 0.0375 - val_acc: 0.9916
Epoch 3/200
 - 1s - loss: 0.0387 - acc: 0.9911 - val_loss: 0.0348 - val_acc: 0.9914
Epoch 4/200
 - 1s - loss: 0.0367 - acc: 0.9912 - val_loss: 0.0333 - val_acc: 0.9914
Epoch 5/200
 - 1s - loss: 0.0352 - acc: 0.9914 - val_loss: 0.0323 - val_acc: 0.9914
Epoch 6/200
 - 1s - loss: 0.0344 - acc: 0.9915 - val_loss: 0.0315 - val_acc: 0.9914
Epoch 7/200
 - 1s - loss: 0.0337 - acc: 0.9921 - val_loss: 0.0309 - val_acc: 0.9914
Epoch 8/200
 - 1s - loss: 0.0328 - acc: 0.9920 - val_loss: 0.0304 - val_acc: 0.9914
Epoch 9/200
 - 1s - loss: 0.0326 - acc: 0.9919 - val_loss: 0.0300 - val_acc: 0.9916
Epoch 10/200
 - 1s - loss: 0.0321 - acc: 0.9919 - val_loss: 0.0297 - val_acc: 0.9916
... (epochs 11–160 trimmed: loss 0.0317→0.0262, val_loss 0.0294→0.0241, val_acc 0.9918→0.9926) ...
Epoch 161/200
 - 1s - loss: 0.0258 - acc: 0.9929 - val_loss: 0.0241 - val_acc: 0.9926
Epoch 162/200
 - 1s - loss: 0.0257 - acc: 0.9929 - val_loss: 0.0241 - val_acc: 0.9926
Epoch 163/200
 - 1s - loss: 0.0256 - acc: 0.9934 - val_loss: 0.0241 - val_acc: 0.9926
Epoch 164/200
 - 1s - loss: 0.0257 - acc: 0.9931 - val_loss: 0.0241 - val_acc: 0.9926
Epoch 165/200
 - 1s - loss: 0.0256 - acc: 0.9928 - val_loss: 0.0241 - val_acc: 0.9926
Epoch 166/200
 - 1s - loss: 0.0258 - acc: 0.9933 - val_loss: 0.0241 - val_acc: 0.9926
Epoch 167/200
 - 1s - loss: 0.0259 - acc: 0.9933 - val_loss: 0.0241 - val_acc: 0.9926
Epoch 168/200
 - 1s - loss: 0.0255 - acc: 0.9932 - val_loss: 0.0241 - val_acc: 0.9926
Epoch 169/200
 - 1s - loss: 0.0254 - acc: 0.9937 - val_loss: 0.0241 - val_acc: 0.9926
Epoch 170/200
 - 1s - loss: 0.0258 - acc: 0.9928 - val_loss: 0.0241 - val_acc: 0.9926
Epoch 171/200
 - 1s - loss: 0.0260 - acc: 0.9926 - val_loss: 0.0241 - val_acc: 0.9926
Epoch 172/200
 - 1s - loss: 0.0256 - acc: 0.9932 - val_loss: 0.0240 - val_acc: 0.9926
Epoch 173/200
 - 1s - loss: 0.0254 - acc: 0.9933 - val_loss: 0.0240 - val_acc: 0.9926
Epoch 174/200
 - 1s - loss: 0.0254 - acc: 0.9933 - val_loss: 0.0240 - val_acc: 0.9926
Epoch 175/200
 - 1s - loss: 0.0254 - acc: 0.9932 - val_loss: 0.0240 - val_acc: 0.9926
Epoch 176/200
 - 1s - loss: 0.0256 - acc: 0.9933 - val_loss: 0.0240 - val_acc: 0.9926
Epoch 177/200
 - 1s - loss: 0.0254 - acc: 0.9931 - val_loss: 0.0240 - val_acc: 0.9926
Epoch 178/200
 - 1s - loss: 0.0256 - acc: 0.9928 - val_loss: 0.0240 - val_acc: 0.9926
Epoch 179/200
 - 1s - loss: 0.0253 - acc: 0.9934 - val_loss: 0.0240 - val_acc: 0.9926
Epoch 180/200
 - 1s - loss: 0.0256 - acc: 0.9934 - val_loss: 0.0240 - val_acc: 0.9926
Epoch 181/200
 - 1s - loss: 0.0254 - acc: 0.9931 - val_loss: 0.0240 - val_acc: 0.9926
Epoch 182/200
 - 1s - loss: 0.0256 - acc: 0.9931 - val_loss: 0.0240 - val_acc: 0.9926
Epoch 183/200
 - 1s - loss: 0.0256 - acc: 0.9929 - val_loss: 0.0240 - val_acc: 0.9926
Epoch 184/200
 - 1s - loss: 0.0254 - acc: 0.9935 - val_loss: 0.0240 - val_acc: 0.9926
Epoch 185/200
 - 1s - loss: 0.0256 - acc: 0.9931 - val_loss: 0.0240 - val_acc: 0.9926
Epoch 186/200
 - 1s - loss: 0.0255 - acc: 0.9929 - val_loss: 0.0240 - val_acc: 0.9926
Epoch 187/200
 - 1s - loss: 0.0253 - acc: 0.9931 - val_loss: 0.0239 - val_acc: 0.9926
Epoch 188/200
 - 1s - loss: 0.0257 - acc: 0.9930 - val_loss: 0.0239 - val_acc: 0.9926
Epoch 189/200
 - 1s - loss: 0.0253 - acc: 0.9927 - val_loss: 0.0239 - val_acc: 0.9926
Epoch 190/200
 - 1s - loss: 0.0252 - acc: 0.9933 - val_loss: 0.0239 - val_acc: 0.9926
Epoch 191/200
 - 1s - loss: 0.0255 - acc: 0.9935 - val_loss: 0.0239 - val_acc: 0.9926
Epoch 192/200
 - 1s - loss: 0.0257 - acc: 0.9928 - val_loss: 0.0239 - val_acc: 0.9926
Epoch 193/200
 - 1s - loss: 0.0255 - acc: 0.9933 - val_loss: 0.0239 - val_acc: 0.9926
Epoch 194/200
 - 1s - loss: 0.0255 - acc: 0.9929 - val_loss: 0.0239 - val_acc: 0.9926
Epoch 195/200
 - 1s - loss: 0.0251 - acc: 0.9931 - val_loss: 0.0239 - val_acc: 0.9926
Epoch 196/200
 - 1s - loss: 0.0257 - acc: 0.9929 - val_loss: 0.0239 - val_acc: 0.9926
Epoch 197/200
 - 1s - loss: 0.0254 - acc: 0.9935 - val_loss: 0.0239 - val_acc: 0.9926
Epoch 198/200
 - 1s - loss: 0.0255 - acc: 0.9932 - val_loss: 0.0239 - val_acc: 0.9926
Epoch 199/200
 - 1s - loss: 0.0255 - acc: 0.9931 - val_loss: 0.0239 - val_acc: 0.9926
Epoch 200/200
 - 1s - loss: 0.0254 - acc: 0.9931 - val_loss: 0.0239 - val_acc: 0.9926
2018-03-27 11:02:49,796 [INFO] Evaluate...
2018-03-27 11:02:53,148 [INFO] Done!
2018-03-27 11:02:53,154 [INFO] tpe_transform took 0.002455 seconds
2018-03-27 11:02:53,155 [INFO] TPE using 47/47 trials with best loss 0.011121
2018-03-27 11:02:53,162 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:02:54,150 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 6s - loss: 0.0695 - acc: 0.9778 - val_loss: 0.0387 - val_acc: 0.9904
Epoch 2/200
 - 1s - loss: 0.0349 - acc: 0.9899 - val_loss: 0.0345 - val_acc: 0.9908
Epoch 3/200
 - 1s - loss: 0.0317 - acc: 0.9909 - val_loss: 0.0326 - val_acc: 0.9910
Epoch 4/200
 - 1s - loss: 0.0306 - acc: 0.9914 - val_loss: 0.0314 - val_acc: 0.9906
Epoch 5/200
 - 1s - loss: 0.0291 - acc: 0.9919 - val_loss: 0.0308 - val_acc: 0.9908
Epoch 6/200
 - 1s - loss: 0.0287 - acc: 0.9922 - val_loss: 0.0302 - val_acc: 0.9914
Epoch 7/200
 - 1s - loss: 0.0276 - acc: 0.9923 - val_loss: 0.0298 - val_acc: 0.9916
Epoch 8/200
 - 1s - loss: 0.0275 - acc: 0.9919 - val_loss: 0.0294 - val_acc: 0.9914
Epoch 9/200
 - 1s - loss: 0.0269 - acc: 0.9924 - val_loss: 0.0290 - val_acc: 0.9912
Epoch 10/200
 - 1s - loss: 0.0264 - acc: 0.9925 - val_loss: 0.0288 - val_acc: 0.9916
Epoch 11/200
 - 1s - loss: 0.0265 - acc: 0.9926 - val_loss: 0.0286 - val_acc: 0.9918
... (epochs 12–198 elided: loss slowly decreased 0.0264→0.0212, val_loss 0.0284→0.0247, val_acc plateaued at 0.9922–0.9924) ...
Epoch 199/200
 - 1s - loss: 0.0210 - acc: 0.9937 - val_loss: 0.0247 - val_acc: 0.9922
Epoch 200/200
 - 1s - loss: 0.0212 - acc: 0.9938 - val_loss: 0.0247 - val_acc: 0.9922
2018-03-27 11:06:06,917 [INFO] Evaluate...
2018-03-27 11:06:10,302 [INFO] Done!
2018-03-27 11:06:10,308 [INFO] tpe_transform took 0.002474 seconds
2018-03-27 11:06:10,309 [INFO] TPE using 48/48 trials with best loss 0.011121
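In the trials above, the validation loss changes only in the fourth decimal place over most of the 200 epochs, so a patience-based early-stopping rule would end each run far sooner. A minimal sketch of the stopping rule in plain Python (a hypothetical helper for illustration, not code from this notebook; the loss values are illustrative, not taken from the logs):

```python
def early_stop_epoch(val_losses, patience=10, min_delta=1e-4):
    """Return the 1-based epoch at which training would stop: the first
    epoch after `patience` consecutive epochs without an improvement of
    at least `min_delta` over the best val_loss seen so far."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best - min_delta:
            best = loss   # real improvement: reset the patience counter
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return len(val_losses)  # never triggered: train to the end

# A val_loss curve that improves briefly and then plateaus,
# like the logs above (values illustrative):
curve = [0.039, 0.035, 0.033, 0.031] + [0.030] * 50
print(early_stop_epoch(curve))  # → 15, long before epoch 54
```

Keras ships this logic as the `EarlyStopping` callback (`keras.callbacks.EarlyStopping(monitor='val_loss', patience=...)`), which can be passed to `model.fit` via `callbacks=[...]`.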
2018-03-27 11:06:10,317 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:06:11,306 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 6s - loss: 0.0630 - acc: 0.9736 - val_loss: 0.0283 - val_acc: 0.9922
Epoch 2/200
 - 1s - loss: 0.0287 - acc: 0.9899 - val_loss: 0.0261 - val_acc: 0.9924
Epoch 3/200
 - 1s - loss: 0.0262 - acc: 0.9914 - val_loss: 0.0252 - val_acc: 0.9926
Epoch 4/200
 - 1s - loss: 0.0262 - acc: 0.9914 - val_loss: 0.0246 - val_acc: 0.9924
Epoch 5/200
 - 1s - loss: 0.0258 - acc: 0.9914 - val_loss: 0.0244 - val_acc: 0.9926
Epoch 6/200
 - 1s - loss: 0.0246 - acc: 0.9919 - val_loss: 0.0241 - val_acc: 0.9928
Epoch 7/200
 - 1s - loss: 0.0237 - acc: 0.9926 - val_loss: 0.0240 - val_acc: 0.9928
Epoch 8/200
 - 1s - loss: 0.0251 - acc: 0.9920 - val_loss: 0.0238 - val_acc: 0.9928
Epoch 9/200
 - 1s - loss: 0.0247 - acc: 0.9921 - val_loss: 0.0237 - val_acc: 0.9928
Epoch 10/200
 - 1s - loss: 0.0239 - acc: 0.9919 - val_loss: 0.0235 - val_acc: 0.9928
Epoch 11/200
 - 1s - loss: 0.0232 - acc: 0.9928 - val_loss: 0.0234 - val_acc: 0.9928
... (epochs 12–127 elided: loss drifted from 0.0230 to ≈0.0208, val_loss 0.0233→0.0214, val_acc rose to 0.9938) ...
Epoch 128/200
 - 1s - loss: 0.0211 - acc: 0.9931 - val_loss: 0.0214 - val_acc: 0.9938
Epoch 129/200
 - 1s - loss: 0.0213 - acc: 0.9931 - val_loss: 0.0214 - val_acc: 0.9938
Epoch 130/200
 - 1s - loss: 0.0211 - acc: 0.9932 - val_loss: 0.0214 - val_acc: 0.9938
Epoch 131/200
 - 1s - loss: 0.0204 - acc: 0.9937 - val_loss: 0.0214 - val_acc: 0.9938
Epoch 132/200
 - 1s - loss: 0.0205 - acc: 0.9939 - val_loss: 0.0214 - val_acc: 0.9938
Epoch 133/200
 - 1s - loss: 0.0208 - acc: 0.9935 - val_loss: 0.0214 - val_acc: 0.9938
Epoch 134/200
 - 1s - loss: 0.0204 - acc: 0.9936 - val_loss: 0.0214 - val_acc: 0.9938
Epoch 135/200
 - 1s - loss: 0.0205 - acc: 0.9931 - val_loss: 0.0214 - val_acc: 0.9938
Epoch 136/200
 - 1s - loss: 0.0209 - acc: 0.9933 - val_loss: 0.0214 - val_acc: 0.9938
Epoch 137/200
 - 1s - loss: 0.0201 - acc: 0.9939 - val_loss: 0.0214 - val_acc: 0.9938
Epoch 138/200
 - 1s - loss: 0.0204 - acc: 0.9928 - val_loss: 0.0214 - val_acc: 0.9938
Epoch 139/200
 - 1s - loss: 0.0215 - acc: 0.9931 - val_loss: 0.0214 - val_acc: 0.9938
Epoch 140/200
 - 1s - loss: 0.0211 - acc: 0.9935 - val_loss: 0.0214 - val_acc: 0.9938
Epoch 141/200
 - 1s - loss: 0.0201 - acc: 0.9938 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 142/200
 - 1s - loss: 0.0210 - acc: 0.9932 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 143/200
 - 1s - loss: 0.0201 - acc: 0.9937 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 144/200
 - 1s - loss: 0.0208 - acc: 0.9931 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 145/200
 - 1s - loss: 0.0212 - acc: 0.9927 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 146/200
 - 1s - loss: 0.0206 - acc: 0.9932 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 147/200
 - 1s - loss: 0.0211 - acc: 0.9931 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 148/200
 - 1s - loss: 0.0211 - acc: 0.9933 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 149/200
 - 1s - loss: 0.0210 - acc: 0.9936 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 150/200
 - 1s - loss: 0.0205 - acc: 0.9933 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 151/200
 - 1s - loss: 0.0205 - acc: 0.9931 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 152/200
 - 1s - loss: 0.0209 - acc: 0.9935 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 153/200
 - 1s - loss: 0.0209 - acc: 0.9934 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 154/200
 - 1s - loss: 0.0212 - acc: 0.9935 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 155/200
 - 1s - loss: 0.0212 - acc: 0.9933 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 156/200
 - 1s - loss: 0.0203 - acc: 0.9940 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 157/200
 - 1s - loss: 0.0210 - acc: 0.9937 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 158/200
 - 1s - loss: 0.0205 - acc: 0.9938 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 159/200
 - 1s - loss: 0.0207 - acc: 0.9935 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 160/200
 - 1s - loss: 0.0205 - acc: 0.9932 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 161/200
 - 1s - loss: 0.0203 - acc: 0.9940 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 162/200
 - 1s - loss: 0.0208 - acc: 0.9934 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 163/200
 - 1s - loss: 0.0201 - acc: 0.9936 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 164/200
 - 1s - loss: 0.0205 - acc: 0.9932 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 165/200
 - 1s - loss: 0.0214 - acc: 0.9935 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 166/200
 - 1s - loss: 0.0205 - acc: 0.9937 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 167/200
 - 1s - loss: 0.0208 - acc: 0.9934 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 168/200
 - 1s - loss: 0.0200 - acc: 0.9932 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 169/200
 - 1s - loss: 0.0204 - acc: 0.9937 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 170/200
 - 1s - loss: 0.0207 - acc: 0.9936 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 171/200
 - 1s - loss: 0.0208 - acc: 0.9935 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 172/200
 - 1s - loss: 0.0202 - acc: 0.9937 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 173/200
 - 1s - loss: 0.0206 - acc: 0.9934 - val_loss: 0.0213 - val_acc: 0.9938
Epoch 174/200
 - 1s - loss: 0.0207 - acc: 0.9937 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 175/200
 - 1s - loss: 0.0205 - acc: 0.9936 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 176/200
 - 1s - loss: 0.0207 - acc: 0.9939 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 177/200
 - 1s - loss: 0.0207 - acc: 0.9930 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 178/200
 - 1s - loss: 0.0214 - acc: 0.9932 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 179/200
 - 1s - loss: 0.0205 - acc: 0.9935 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 180/200
 - 1s - loss: 0.0209 - acc: 0.9929 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 181/200
 - 1s - loss: 0.0205 - acc: 0.9935 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 182/200
 - 1s - loss: 0.0204 - acc: 0.9933 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 183/200
 - 1s - loss: 0.0206 - acc: 0.9932 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 184/200
 - 1s - loss: 0.0206 - acc: 0.9933 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 185/200
 - 1s - loss: 0.0202 - acc: 0.9931 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 186/200
 - 1s - loss: 0.0200 - acc: 0.9933 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 187/200
 - 1s - loss: 0.0203 - acc: 0.9934 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 188/200
 - 1s - loss: 0.0207 - acc: 0.9935 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 189/200
 - 1s - loss: 0.0201 - acc: 0.9936 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 190/200
 - 1s - loss: 0.0205 - acc: 0.9935 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 191/200
 - 1s - loss: 0.0198 - acc: 0.9938 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 192/200
 - 1s - loss: 0.0208 - acc: 0.9933 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 193/200
 - 1s - loss: 0.0203 - acc: 0.9931 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 194/200
 - 1s - loss: 0.0211 - acc: 0.9935 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 195/200
 - 1s - loss: 0.0209 - acc: 0.9935 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 196/200
 - 1s - loss: 0.0204 - acc: 0.9938 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 197/200
 - 1s - loss: 0.0206 - acc: 0.9937 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 198/200
 - 1s - loss: 0.0208 - acc: 0.9931 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 199/200
 - 1s - loss: 0.0199 - acc: 0.9937 - val_loss: 0.0212 - val_acc: 0.9938
Epoch 200/200
 - 1s - loss: 0.0200 - acc: 0.9940 - val_loss: 0.0212 - val_acc: 0.9938
2018-03-27 11:09:23,894 [INFO] Evaluate...
2018-03-27 11:09:27,308 [INFO] Done!
2018-03-27 11:09:27,315 [INFO] tpe_transform took 0.003364 seconds
2018-03-27 11:09:27,316 [INFO] TPE using 49/49 trials with best loss 0.011121
2018-03-27 11:09:27,322 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:09:28,329 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 6s - loss: 0.0751 - acc: 0.9768 - val_loss: 0.0298 - val_acc: 0.9922
Epoch 2/200
 - 1s - loss: 0.0342 - acc: 0.9902 - val_loss: 0.0245 - val_acc: 0.9932
Epoch 3/200
 - 1s - loss: 0.0294 - acc: 0.9915 - val_loss: 0.0220 - val_acc: 0.9940
... (epochs 4-199 omitted: val_loss falls steadily from 0.0213 to 0.0160, val_acc converges to 0.9956)
Epoch 200/200
 - 1s - loss: 0.0175 - acc: 0.9942 - val_loss: 0.0160 - val_acc: 0.9956
2018-03-27 11:12:40,709 [INFO] Evaluate...
2018-03-27 11:12:44,141 [INFO] Done!
2018-03-27 11:12:44,149 [INFO] tpe_transform took 0.002378 seconds
2018-03-27 11:12:44,149 [INFO] TPE using 50/50 trials with best loss 0.011121
2018-03-27 11:12:44,156 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:12:45,144 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 6s - loss: 0.0600 - acc: 0.9774 - val_loss: 0.0297 - val_acc: 0.9922
Epoch 2/200
 - 1s - loss: 0.0286 - acc: 0.9922 - val_loss: 0.0259 - val_acc: 0.9932
Epoch 3/200
 - 1s - loss: 0.0258 - acc: 0.9927 - val_loss: 0.0244 - val_acc: 0.9932
Epoch 4/200
 - 1s - loss: 0.0246 - acc: 0.9931 - val_loss: 0.0236 - val_acc: 0.9934
Epoch 5/200
 - 1s - loss: 0.0239 - acc: 0.9933 - val_loss: 0.0230 - val_acc: 0.9936
Epoch 6/200
 - 1s - loss: 0.0232 - acc: 0.9933 - val_loss: 0.0225 - val_acc: 0.9936
Epoch 7/200
 - 1s - loss: 0.0231 - acc: 0.9931 - val_loss: 0.0221 - val_acc: 0.9938
Epoch 8/200
 - 1s - loss: 0.0225 - acc: 0.9933 - val_loss: 0.0218 - val_acc: 0.9936
Epoch 9/200
 - 1s - loss: 0.0221 - acc: 0.9937 - val_loss: 0.0216 - val_acc: 0.9938
Epoch 10/200
 - 1s - loss: 0.0220 - acc: 0.9941 - val_loss: 0.0214 - val_acc: 0.9940
Epoch 11/200
 - 1s - loss: 0.0221 - acc: 0.9935 - val_loss: 0.0213 - val_acc: 0.9942
Epoch 12/200
 - 1s - loss: 0.0216 - acc: 0.9935 - val_loss: 0.0211 - val_acc: 0.9942
Epoch 13/200
 - 1s - loss: 0.0215 - acc: 0.9936 - val_loss: 0.0210 - val_acc: 0.9942
Epoch 14/200
 - 1s - loss: 0.0212 - acc: 0.9941 - val_loss: 0.0208 - val_acc: 0.9944
Epoch 15/200
 - 1s - loss: 0.0213 - acc: 0.9936 - val_loss: 0.0207 - val_acc: 0.9944
Epoch 16/200
 - 1s - loss: 0.0211 - acc: 0.9937 - val_loss: 0.0206 - val_acc: 0.9944
Epoch 17/200
 - 1s - loss: 0.0206 - acc: 0.9937 - val_loss: 0.0205 - val_acc: 0.9946
Epoch 18/200
 - 1s - loss: 0.0210 - acc: 0.9941 - val_loss: 0.0205 - val_acc: 0.9946
Epoch 19/200
 - 1s - loss: 0.0211 - acc: 0.9938 - val_loss: 0.0204 - val_acc: 0.9946
Epoch 20/200
 - 1s - loss: 0.0207 - acc: 0.9938 - val_loss: 0.0203 - val_acc: 0.9946
Epoch 21/200
 - 1s - loss: 0.0205 - acc: 0.9941 - val_loss: 0.0202 - val_acc: 0.9946
Epoch 22/200
 - 1s - loss: 0.0202 - acc: 0.9940 - val_loss: 0.0202 - val_acc: 0.9946
Epoch 23/200
 - 1s - loss: 0.0204 - acc: 0.9941 - val_loss: 0.0201 - val_acc: 0.9946
Epoch 24/200
 - 1s - loss: 0.0203 - acc: 0.9937 - val_loss: 0.0201 - val_acc: 0.9946
Epoch 25/200
 - 1s - loss: 0.0202 - acc: 0.9941 - val_loss: 0.0200 - val_acc: 0.9946
Epoch 26/200
 - 1s - loss: 0.0199 - acc: 0.9938 - val_loss: 0.0200 - val_acc: 0.9946
Epoch 27/200
 - 1s - loss: 0.0199 - acc: 0.9942 - val_loss: 0.0199 - val_acc: 0.9946
Epoch 28/200
 - 1s - loss: 0.0200 - acc: 0.9942 - val_loss: 0.0199 - val_acc: 0.9946
Epoch 29/200
 - 1s - loss: 0.0196 - acc: 0.9941 - val_loss: 0.0198 - val_acc: 0.9946
Epoch 30/200
 - 1s - loss: 0.0200 - acc: 0.9941 - val_loss: 0.0198 - val_acc: 0.9946
Epoch 31/200
 - 1s - loss: 0.0197 - acc: 0.9945 - val_loss: 0.0198 - val_acc: 0.9946
Epoch 32/200
 - 1s - loss: 0.0199 - acc: 0.9943 - val_loss: 0.0197 - val_acc: 0.9946
Epoch 33/200
 - 1s - loss: 0.0199 - acc: 0.9941 - val_loss: 0.0197 - val_acc: 0.9946
Epoch 34/200
 - 1s - loss: 0.0192 - acc: 0.9945 - val_loss: 0.0196 - val_acc: 0.9946
Epoch 35/200
 - 1s - loss: 0.0196 - acc: 0.9943 - val_loss: 0.0196 - val_acc: 0.9946
Epoch 36/200
 - 1s - loss: 0.0198 - acc: 0.9940 - val_loss: 0.0196 - val_acc: 0.9946
Epoch 37/200
 - 1s - loss: 0.0195 - acc: 0.9944 - val_loss: 0.0196 - val_acc: 0.9946
Epoch 38/200
 - 1s - loss: 0.0198 - acc: 0.9940 - val_loss: 0.0195 - val_acc: 0.9946
Epoch 39/200
 - 1s - loss: 0.0195 - acc: 0.9944 - val_loss: 0.0195 - val_acc: 0.9946
Epoch 40/200
 - 1s - loss: 0.0190 - acc: 0.9946 - val_loss: 0.0195 - val_acc: 0.9946
Epoch 41/200
 - 1s - loss: 0.0194 - acc: 0.9944 - val_loss: 0.0194 - val_acc: 0.9946
Epoch 42/200
 - 1s - loss: 0.0196 - acc: 0.9938 - val_loss: 0.0194 - val_acc: 0.9946
Epoch 43/200
 - 1s - loss: 0.0194 - acc: 0.9942 - val_loss: 0.0194 - val_acc: 0.9946
Epoch 44/200
 - 1s - loss: 0.0192 - acc: 0.9944 - val_loss: 0.0194 - val_acc: 0.9946
Epoch 45/200
 - 1s - loss: 0.0195 - acc: 0.9940 - val_loss: 0.0193 - val_acc: 0.9946
... (epochs 46–199 omitted: loss decreases slowly from ~0.0194 to ~0.0175, val_loss drifts from 0.0193 to 0.0181, and val_acc settles at 0.9952)
Epoch 200/200
 - 1s - loss: 0.0175 - acc: 0.9949 - val_loss: 0.0181 - val_acc: 0.9952
2018-03-27 11:15:58,508 [INFO] Evaluate...
2018-03-27 11:16:02,002 [INFO] Done!
2018-03-27 11:16:02,009 [INFO] tpe_transform took 0.002518 seconds
2018-03-27 11:16:02,010 [INFO] TPE using 51/51 trials with best loss 0.011121
2018-03-27 11:16:02,018 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:16:03,006 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 6s - loss: 0.1424 - acc: 0.9610 - val_loss: 0.0727 - val_acc: 0.9870
Epoch 2/200
 - 1s - loss: 0.0810 - acc: 0.9809 - val_loss: 0.0623 - val_acc: 0.9878
... (epochs 3–199 omitted: loss falls from 0.0724 to ~0.0442, val_loss from 0.0579 to 0.0383, and val_acc plateaus around 0.9898–0.9900)
Epoch 200/200
 - 1s - loss: 0.0453 - acc: 0.9881 - val_loss: 0.0383 - val_acc: 0.9900
2018-03-27 11:19:17,023 [INFO] Evaluate...
2018-03-27 11:19:20,549 [INFO] Done!
2018-03-27 11:19:20,556 [INFO] tpe_transform took 0.003351 seconds
2018-03-27 11:19:20,557 [INFO] TPE using 52/52 trials with best loss 0.011121
2018-03-27 11:19:20,565 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
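The repeated 「load features from …」 lines above correspond to concatenating the bottleneck features that were pre-extracted with InceptionV3, Xception, and ResNet50 earlier in the notebook. A minimal sketch of that loading step (the dataset names `'train'` and `'label'` are an assumption based on the helper code, which is not shown here):

```python
import numpy as np
import h5py

def load_features(paths):
    """Concatenate pre-extracted CNN features stored in HDF5 files.

    Each file is assumed to hold a 'train' feature matrix and a 'label'
    vector (an assumption about the helper's output format). Features
    from all files are concatenated along the feature axis, so a model
    trained on them sees InceptionV3 + Xception + ResNet50 jointly.
    """
    xs, y = [], None
    for p in paths:
        with h5py.File(p, "r") as f:
            xs.append(np.array(f["train"]))
            y = np.array(f["label"])  # labels are identical across files
    return np.concatenate(xs, axis=1), y
```

With three feature files of widths 2048 each, this yields a single `(n_samples, 6144)` input matrix for the small dense classifier being fit in these logs.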
2018-03-27 11:19:22,140 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 6s - loss: 0.0427 - acc: 0.9841 - val_loss: 0.0228 - val_acc: 0.9926
Epoch 2/200
 - 1s - loss: 0.0224 - acc: 0.9928 - val_loss: 0.0213 - val_acc: 0.9932
... (epochs 3–54 omitted: loss falls from 0.0210 to ~0.0176, val_loss from 0.0206 to 0.0181, and val_acc holds near 0.9936)
Epoch 55/200
 - 1s - loss: 0.0170 - acc: 0.9945 - val_loss: 0.0181 - val_acc: 0.9936
Epoch 56/200
 - 1s - loss: 0.0175 - acc: 0.9947 - val_loss: 0.0181 - val_acc: 0.9936
Epoch 57/200
 - 1s - loss: 0.0170 - acc: 0.9944 - val_loss: 0.0181 - val_acc: 0.9936
Epoch 58/200
 - 1s - loss: 0.0171 - acc: 0.9946 - val_loss: 0.0181 - val_acc: 0.9936
Epoch 59/200
 - 1s - loss: 0.0171 - acc: 0.9953 - val_loss: 0.0181 - val_acc: 0.9936
Epoch 60/200
 - 1s - loss: 0.0169 - acc: 0.9949 - val_loss: 0.0181 - val_acc: 0.9936
Epoch 61/200
 - 1s - loss: 0.0162 - acc: 0.9954 - val_loss: 0.0181 - val_acc: 0.9936
Epoch 62/200
 - 1s - loss: 0.0165 - acc: 0.9946 - val_loss: 0.0180 - val_acc: 0.9936
Epoch 63/200
 - 1s - loss: 0.0167 - acc: 0.9948 - val_loss: 0.0180 - val_acc: 0.9936
Epoch 64/200
 - 1s - loss: 0.0172 - acc: 0.9946 - val_loss: 0.0180 - val_acc: 0.9936
Epoch 65/200
 - 1s - loss: 0.0170 - acc: 0.9947 - val_loss: 0.0180 - val_acc: 0.9936
Epoch 66/200
 - 1s - loss: 0.0171 - acc: 0.9942 - val_loss: 0.0180 - val_acc: 0.9936
Epoch 67/200
 - 1s - loss: 0.0166 - acc: 0.9950 - val_loss: 0.0180 - val_acc: 0.9936
Epoch 68/200
 - 1s - loss: 0.0167 - acc: 0.9945 - val_loss: 0.0180 - val_acc: 0.9936
Epoch 69/200
 - 1s - loss: 0.0161 - acc: 0.9947 - val_loss: 0.0180 - val_acc: 0.9936
Epoch 70/200
 - 1s - loss: 0.0168 - acc: 0.9946 - val_loss: 0.0180 - val_acc: 0.9936
Epoch 71/200
 - 1s - loss: 0.0170 - acc: 0.9946 - val_loss: 0.0180 - val_acc: 0.9936
Epoch 72/200
 - 1s - loss: 0.0169 - acc: 0.9952 - val_loss: 0.0180 - val_acc: 0.9936
Epoch 73/200
 - 1s - loss: 0.0166 - acc: 0.9942 - val_loss: 0.0180 - val_acc: 0.9936
Epoch 74/200
 - 1s - loss: 0.0164 - acc: 0.9950 - val_loss: 0.0180 - val_acc: 0.9936
Epoch 75/200
 - 1s - loss: 0.0166 - acc: 0.9951 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 76/200
 - 1s - loss: 0.0169 - acc: 0.9946 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 77/200
 - 1s - loss: 0.0167 - acc: 0.9950 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 78/200
 - 1s - loss: 0.0165 - acc: 0.9950 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 79/200
 - 1s - loss: 0.0168 - acc: 0.9950 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 80/200
 - 1s - loss: 0.0164 - acc: 0.9950 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 81/200
 - 1s - loss: 0.0167 - acc: 0.9949 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 82/200
 - 1s - loss: 0.0161 - acc: 0.9949 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 83/200
 - 1s - loss: 0.0166 - acc: 0.9948 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 84/200
 - 1s - loss: 0.0170 - acc: 0.9945 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 85/200
 - 1s - loss: 0.0164 - acc: 0.9949 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 86/200
 - 1s - loss: 0.0162 - acc: 0.9949 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 87/200
 - 1s - loss: 0.0165 - acc: 0.9950 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 88/200
 - 1s - loss: 0.0168 - acc: 0.9948 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 89/200
 - 1s - loss: 0.0169 - acc: 0.9947 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 90/200
 - 1s - loss: 0.0167 - acc: 0.9946 - val_loss: 0.0179 - val_acc: 0.9938
Epoch 91/200
 - 1s - loss: 0.0168 - acc: 0.9947 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 92/200
 - 1s - loss: 0.0164 - acc: 0.9950 - val_loss: 0.0179 - val_acc: 0.9936
Epoch 93/200
 - 1s - loss: 0.0166 - acc: 0.9950 - val_loss: 0.0178 - val_acc: 0.9936
Epoch 94/200
 - 1s - loss: 0.0169 - acc: 0.9950 - val_loss: 0.0178 - val_acc: 0.9938
Epoch 95/200
 - 1s - loss: 0.0165 - acc: 0.9946 - val_loss: 0.0178 - val_acc: 0.9936
Epoch 96/200
 - 1s - loss: 0.0165 - acc: 0.9946 - val_loss: 0.0178 - val_acc: 0.9936
Epoch 97/200
 - 1s - loss: 0.0170 - acc: 0.9948 - val_loss: 0.0178 - val_acc: 0.9936
Epoch 98/200
 - 1s - loss: 0.0165 - acc: 0.9949 - val_loss: 0.0178 - val_acc: 0.9936
Epoch 99/200
 - 1s - loss: 0.0167 - acc: 0.9949 - val_loss: 0.0178 - val_acc: 0.9936
Epoch 100/200
 - 1s - loss: 0.0168 - acc: 0.9947 - val_loss: 0.0178 - val_acc: 0.9936
Epoch 101/200
 - 1s - loss: 0.0162 - acc: 0.9947 - val_loss: 0.0178 - val_acc: 0.9936
Epoch 102/200
 - 1s - loss: 0.0163 - acc: 0.9947 - val_loss: 0.0178 - val_acc: 0.9936
Epoch 103/200
 - 1s - loss: 0.0170 - acc: 0.9947 - val_loss: 0.0178 - val_acc: 0.9938
Epoch 104/200
 - 1s - loss: 0.0165 - acc: 0.9947 - val_loss: 0.0178 - val_acc: 0.9938
Epoch 105/200
 - 1s - loss: 0.0169 - acc: 0.9946 - val_loss: 0.0178 - val_acc: 0.9938
Epoch 106/200
 - 1s - loss: 0.0160 - acc: 0.9952 - val_loss: 0.0178 - val_acc: 0.9938
Epoch 107/200
 - 1s - loss: 0.0159 - acc: 0.9951 - val_loss: 0.0178 - val_acc: 0.9938
Epoch 108/200
 - 1s - loss: 0.0167 - acc: 0.9950 - val_loss: 0.0178 - val_acc: 0.9938
Epoch 109/200
 - 1s - loss: 0.0166 - acc: 0.9948 - val_loss: 0.0178 - val_acc: 0.9938
Epoch 110/200
 - 1s - loss: 0.0159 - acc: 0.9949 - val_loss: 0.0178 - val_acc: 0.9938
Epoch 111/200
 - 1s - loss: 0.0165 - acc: 0.9944 - val_loss: 0.0178 - val_acc: 0.9938
Epoch 112/200
 - 1s - loss: 0.0170 - acc: 0.9944 - val_loss: 0.0178 - val_acc: 0.9938
Epoch 113/200
 - 1s - loss: 0.0167 - acc: 0.9949 - val_loss: 0.0178 - val_acc: 0.9938
Epoch 114/200
 - 1s - loss: 0.0163 - acc: 0.9950 - val_loss: 0.0178 - val_acc: 0.9938
Epoch 115/200
 - 1s - loss: 0.0160 - acc: 0.9950 - val_loss: 0.0178 - val_acc: 0.9938
Epoch 116/200
 - 1s - loss: 0.0167 - acc: 0.9949 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 117/200
 - 1s - loss: 0.0161 - acc: 0.9949 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 118/200
 - 1s - loss: 0.0161 - acc: 0.9955 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 119/200
 - 1s - loss: 0.0161 - acc: 0.9946 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 120/200
 - 1s - loss: 0.0165 - acc: 0.9951 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 121/200
 - 1s - loss: 0.0165 - acc: 0.9948 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 122/200
 - 1s - loss: 0.0161 - acc: 0.9953 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 123/200
 - 1s - loss: 0.0158 - acc: 0.9951 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 124/200
 - 1s - loss: 0.0165 - acc: 0.9951 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 125/200
 - 1s - loss: 0.0164 - acc: 0.9951 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 126/200
 - 1s - loss: 0.0160 - acc: 0.9949 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 127/200
 - 1s - loss: 0.0162 - acc: 0.9953 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 128/200
 - 1s - loss: 0.0163 - acc: 0.9949 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 129/200
 - 1s - loss: 0.0161 - acc: 0.9950 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 130/200
 - 1s - loss: 0.0164 - acc: 0.9950 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 131/200
 - 1s - loss: 0.0160 - acc: 0.9948 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 132/200
 - 1s - loss: 0.0162 - acc: 0.9950 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 133/200
 - 1s - loss: 0.0166 - acc: 0.9949 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 134/200
 - 1s - loss: 0.0166 - acc: 0.9949 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 135/200
 - 1s - loss: 0.0163 - acc: 0.9947 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 136/200
 - 1s - loss: 0.0162 - acc: 0.9950 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 137/200
 - 1s - loss: 0.0160 - acc: 0.9951 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 138/200
 - 1s - loss: 0.0164 - acc: 0.9950 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 139/200
 - 1s - loss: 0.0161 - acc: 0.9947 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 140/200
 - 1s - loss: 0.0164 - acc: 0.9947 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 141/200
 - 1s - loss: 0.0164 - acc: 0.9950 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 142/200
 - 1s - loss: 0.0162 - acc: 0.9955 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 143/200
 - 1s - loss: 0.0167 - acc: 0.9951 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 144/200
 - 1s - loss: 0.0159 - acc: 0.9946 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 145/200
 - 1s - loss: 0.0165 - acc: 0.9945 - val_loss: 0.0177 - val_acc: 0.9938
Epoch 146/200
 - 1s - loss: 0.0155 - acc: 0.9950 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 147/200
 - 1s - loss: 0.0156 - acc: 0.9950 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 148/200
 - 1s - loss: 0.0155 - acc: 0.9955 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 149/200
 - 1s - loss: 0.0161 - acc: 0.9948 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 150/200
 - 1s - loss: 0.0158 - acc: 0.9951 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 151/200
 - 1s - loss: 0.0157 - acc: 0.9949 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 152/200
 - 1s - loss: 0.0165 - acc: 0.9948 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 153/200
 - 1s - loss: 0.0162 - acc: 0.9951 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 154/200
 - 1s - loss: 0.0157 - acc: 0.9951 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 155/200
 - 1s - loss: 0.0167 - acc: 0.9950 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 156/200
 - 1s - loss: 0.0169 - acc: 0.9947 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 157/200
 - 1s - loss: 0.0163 - acc: 0.9950 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 158/200
 - 1s - loss: 0.0158 - acc: 0.9952 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 159/200
 - 1s - loss: 0.0157 - acc: 0.9950 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 160/200
 - 1s - loss: 0.0163 - acc: 0.9949 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 161/200
 - 1s - loss: 0.0166 - acc: 0.9948 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 162/200
 - 1s - loss: 0.0164 - acc: 0.9949 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 163/200
 - 1s - loss: 0.0158 - acc: 0.9951 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 164/200
 - 1s - loss: 0.0165 - acc: 0.9950 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 165/200
 - 1s - loss: 0.0161 - acc: 0.9950 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 166/200
 - 1s - loss: 0.0163 - acc: 0.9950 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 167/200
 - 1s - loss: 0.0165 - acc: 0.9946 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 168/200
 - 1s - loss: 0.0162 - acc: 0.9950 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 169/200
 - 1s - loss: 0.0161 - acc: 0.9951 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 170/200
 - 1s - loss: 0.0159 - acc: 0.9953 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 171/200
 - 1s - loss: 0.0159 - acc: 0.9950 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 172/200
 - 1s - loss: 0.0161 - acc: 0.9951 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 173/200
 - 1s - loss: 0.0162 - acc: 0.9949 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 174/200
 - 1s - loss: 0.0158 - acc: 0.9950 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 175/200
 - 1s - loss: 0.0163 - acc: 0.9947 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 176/200
 - 1s - loss: 0.0160 - acc: 0.9946 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 177/200
 - 1s - loss: 0.0163 - acc: 0.9945 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 178/200
 - 1s - loss: 0.0158 - acc: 0.9955 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 179/200
 - 1s - loss: 0.0159 - acc: 0.9951 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 180/200
 - 1s - loss: 0.0164 - acc: 0.9950 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 181/200
 - 1s - loss: 0.0162 - acc: 0.9951 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 182/200
 - 1s - loss: 0.0162 - acc: 0.9951 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 183/200
 - 1s - loss: 0.0163 - acc: 0.9947 - val_loss: 0.0176 - val_acc: 0.9938
Epoch 184/200
 - 1s - loss: 0.0161 - acc: 0.9949 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 185/200
 - 1s - loss: 0.0161 - acc: 0.9952 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 186/200
 - 1s - loss: 0.0156 - acc: 0.9954 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 187/200
 - 1s - loss: 0.0156 - acc: 0.9950 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 188/200
 - 1s - loss: 0.0158 - acc: 0.9955 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 189/200
 - 1s - loss: 0.0156 - acc: 0.9955 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 190/200
 - 1s - loss: 0.0158 - acc: 0.9949 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 191/200
 - 1s - loss: 0.0158 - acc: 0.9949 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 192/200
 - 1s - loss: 0.0163 - acc: 0.9950 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 193/200
 - 1s - loss: 0.0162 - acc: 0.9953 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 194/200
 - 1s - loss: 0.0159 - acc: 0.9949 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 195/200
 - 1s - loss: 0.0167 - acc: 0.9946 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 196/200
 - 1s - loss: 0.0162 - acc: 0.9951 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 197/200
 - 1s - loss: 0.0161 - acc: 0.9947 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 198/200
 - 1s - loss: 0.0165 - acc: 0.9946 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 199/200
 - 1s - loss: 0.0160 - acc: 0.9947 - val_loss: 0.0175 - val_acc: 0.9938
Epoch 200/200
 - 1s - loss: 0.0156 - acc: 0.9952 - val_loss: 0.0175 - val_acc: 0.9938
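The validation loss above plateaus long before epoch 200 (it barely moves after roughly epoch 20), so each trial spends most of its 200 epochs doing nothing useful. An early-stopping rule on `val_loss` would likely cut these runs short; below is a minimal pure-Python sketch of that rule (the `patience` and `min_delta` values are illustrative, not taken from the notebook — in Keras the equivalent is the `EarlyStopping` callback):

```python
def should_stop(val_losses, patience=10, min_delta=1e-4):
    """Return True once val_loss has failed to improve by at least
    min_delta for `patience` consecutive epochs (early stopping)."""
    best = float('inf')
    wait = 0
    for loss in val_losses:
        if loss < best - min_delta:
            # Meaningful improvement: reset the patience counter.
            best = loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                return True
    return False
```

In Keras this would be `callbacks=[EarlyStopping(monitor='val_loss', min_delta=1e-4, patience=10)]` passed to `model.fit`, which would stop the runs above within a few dozen epochs instead of 200.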
2018-03-27 11:22:35,670 [INFO] Evaluate...
2018-03-27 11:22:39,266 [INFO] Done!
2018-03-27 11:22:39,273 [INFO] tpe_transform took 0.003194 seconds
2018-03-27 11:22:39,274 [INFO] TPE using 53/53 trials with best loss 0.011121
2018-03-27 11:22:39,281 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:22:40,269 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 6s - loss: 0.1244 - acc: 0.9701 - val_loss: 0.0742 - val_acc: 0.9856
Epoch 2/200
 - 1s - loss: 0.0715 - acc: 0.9843 - val_loss: 0.0642 - val_acc: 0.9868
 ...  (epochs 3-199 omitted: val_loss falls steadily from 0.0598 to 0.0398, val_acc rises from 0.9872 to 0.9906)
Epoch 200/200
 - 1s - loss: 0.0403 - acc: 0.9904 - val_loss: 0.0397 - val_acc: 0.9906
2018-03-27 11:25:52,321 [INFO] Evaluate...
2018-03-27 11:25:55,926 [INFO] Done!
2018-03-27 11:25:55,933 [INFO] tpe_transform took 0.002443 seconds
2018-03-27 11:25:55,934 [INFO] TPE using 54/54 trials with best loss 0.011121
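The `TPE using 54/54 trials with best loss 0.011121` message above is the log of a hyperopt-style Bayesian search over the classifier's training hyper-parameters. As a dependency-free sketch, the outer trial loop that such a search manages looks roughly like the following (parameter names, ranges, and the toy loss surface are illustrative, not the notebook's actual search space; hyperopt's `fmin` with `tpe.suggest` replaces the random sampling here with TPE's model-based proposals):

```python
import random

def run_trial(params):
    # Hypothetical stand-in for "build classifier, fit, return val_loss".
    # In the notebook this step trains the model on the merged features.
    lr, dropout = params["lr"], params["dropout"]
    return (lr - 1e-3) ** 2 + (dropout - 0.5) ** 2  # pretend loss surface

def search(n_trials, seed=0):
    rng = random.Random(seed)
    best_params, best_loss = None, float("inf")
    for _ in range(n_trials):
        params = {
            "lr": 10 ** rng.uniform(-4, -2),   # log-uniform learning rate
            "dropout": rng.uniform(0.2, 0.8),  # uniform dropout rate
        }
        loss = run_trial(params)
        if loss < best_loss:
            best_params, best_loss = params, loss
    return best_params, best_loss

params, loss = search(60)
```

Each completed trial corresponds to one `Fit... / Evaluate... / Done!` cycle in the log, after which TPE reports the running best loss across all trials so far.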
2018-03-27 11:25:55,941 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
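The three `load features from …h5` lines above suggest that pre-computed bottleneck features from InceptionV3, Xception, and ResNet50 are concatenated into one input before the small classifier is fit. A NumPy sketch under that assumption (the sample count and feature widths are illustrative stand-ins; with global average pooling each of these backbones happens to emit 2048-dimensional features):

```python
import numpy as np

# Illustrative pooled-feature widths for the three backbones.
widths = {"InceptionV3": 2048, "Xception": 2048, "ResNet50": 2048}
n_samples = 8  # tiny stand-in for the 25,000 training images

# Stand-ins for the arrays that would be read from original_<Model>.h5.
features = [np.random.rand(n_samples, w).astype("float32")
            for w in widths.values()]

# Concatenate along the feature axis to form the merged classifier input.
merged = np.concatenate(features, axis=1)
```

With real data each row of `merged` is one image's stacked feature vector, and the 20000/5000 train/validation split reported below is taken over these rows.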
2018-03-27 11:25:56,925 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 6s - loss: 0.0642 - acc: 0.9756 - val_loss: 0.0310 - val_acc: 0.9912
Epoch 2/200
 - 1s - loss: 0.0340 - acc: 0.9894 - val_loss: 0.0281 - val_acc: 0.9918
Epoch 3/200
 - 1s - loss: 0.0321 - acc: 0.9898 - val_loss: 0.0267 - val_acc: 0.9922
Epoch 4/200
 - 1s - loss: 0.0299 - acc: 0.9912 - val_loss: 0.0260 - val_acc: 0.9926
Epoch 5/200
 - 1s - loss: 0.0294 - acc: 0.9909 - val_loss: 0.0254 - val_acc: 0.9926
...
Epoch 196/200
 - 1s - loss: 0.0226 - acc: 0.9923 - val_loss: 0.0211 - val_acc: 0.9932
Epoch 197/200
 - 1s - loss: 0.0229 - acc: 0.9929 - val_loss: 0.0211 - val_acc: 0.9934
Epoch 198/200
 - 1s - loss: 0.0224 - acc: 0.9936 - val_loss: 0.0211 - val_acc: 0.9934
Epoch 199/200
 - 1s - loss: 0.0220 - acc: 0.9929 - val_loss: 0.0211 - val_acc: 0.9934
Epoch 200/200
 - 1s - loss: 0.0221 - acc: 0.9929 - val_loss: 0.0211 - val_acc: 0.9934
2018-03-27 11:29:10,010 [INFO] Evaluate...
2018-03-27 11:29:13,690 [INFO] Done!
2018-03-27 11:29:13,697 [INFO] tpe_transform took 0.002483 seconds
2018-03-27 11:29:13,698 [INFO] TPE using 55/55 trials with best loss 0.011121
2018-03-27 11:29:13,705 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:29:14,696 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 6s - loss: 0.0467 - acc: 0.9805 - val_loss: 0.0161 - val_acc: 0.9948
Epoch 2/200
 - 1s - loss: 0.0205 - acc: 0.9931 - val_loss: 0.0148 - val_acc: 0.9954
Epoch 3/200
 - 1s - loss: 0.0185 - acc: 0.9936 - val_loss: 0.0142 - val_acc: 0.9954
Epoch 4/200
 - 1s - loss: 0.0173 - acc: 0.9942 - val_loss: 0.0139 - val_acc: 0.9954
Epoch 5/200
 - 1s - loss: 0.0170 - acc: 0.9944 - val_loss: 0.0137 - val_acc: 0.9956
...
Epoch 139/200
 - 1s - loss: 0.0116 - acc: 0.9970 - val_loss: 0.0130 - val_acc: 0.9956
Epoch 140/200
 - 1s - loss: 0.0117 - acc: 0.9965 - val_loss: 0.0130 - val_acc: 0.9954
Epoch 141/200
 - 1s - loss: 0.0119 - acc: 0.9968 - val_loss: 0.0130 - val_acc: 0.9954
2018-03-27 11:31:34,185 [INFO] Evaluate...
2018-03-27 11:31:37,920 [INFO] Done!
2018-03-27 11:31:37,927 [INFO] tpe_transform took 0.003267 seconds
2018-03-27 11:31:37,930 [INFO] TPE using 56/56 trials with best loss 0.011121
2018-03-27 11:31:37,936 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:31:38,921 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 6s - loss: 0.0731 - acc: 0.9765 - val_loss: 0.0343 - val_acc: 0.9910
Epoch 2/200
 - 1s - loss: 0.0342 - acc: 0.9904 - val_loss: 0.0304 - val_acc: 0.9910
Epoch 3/200
 - 1s - loss: 0.0318 - acc: 0.9914 - val_loss: 0.0288 - val_acc: 0.9912
Epoch 4/200
 - 1s - loss: 0.0311 - acc: 0.9912 - val_loss: 0.0278 - val_acc: 0.9914
Epoch 5/200
 - 1s - loss: 0.0298 - acc: 0.9917 - val_loss: 0.0272 - val_acc: 0.9918
...
Epoch 37/200
 - 1s - loss: 0.0246 - acc: 0.9929 - val_loss: 0.0234 - val_acc: 0.9936
Epoch 38/200
 - 1s - loss: 0.0248 - acc: 0.9931 - val_loss: 0.0234 - val_acc: 0.9936
Epoch 39/200
 - 1s - loss: 0.0245 - acc: 0.9929 - val_loss: 0.0234 - val_acc: 0.9936
Epoch 40/200
 - 1s - loss: 0.0250 - acc: 0.9929 - val_loss: 0.0233 - val_acc: 0.9936
Epoch 41/200
 - 1s - loss: 0.0251 - acc: 0.9927 - val_loss: 0.0233 - val_acc: 0.9938
Epoch 42/200
 - 1s - loss: 0.0243 - acc: 0.9933 - val_loss: 0.0233 - val_acc: 0.9938
Epoch 43/200
 - 1s - loss: 0.0245 - acc: 0.9930 - val_loss: 0.0232 - val_acc: 0.9938
Epoch 44/200
 - 1s - loss: 0.0250 - acc: 0.9925 - val_loss: 0.0232 - val_acc: 0.9938
Epoch 45/200
 - 1s - loss: 0.0246 - acc: 0.9931 - val_loss: 0.0232 - val_acc: 0.9936
Epoch 46/200
 - 1s - loss: 0.0245 - acc: 0.9927 - val_loss: 0.0232 - val_acc: 0.9938
Epoch 47/200
 - 1s - loss: 0.0247 - acc: 0.9931 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 48/200
 - 1s - loss: 0.0244 - acc: 0.9926 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 49/200
 - 1s - loss: 0.0244 - acc: 0.9929 - val_loss: 0.0231 - val_acc: 0.9936
Epoch 50/200
 - 1s - loss: 0.0243 - acc: 0.9929 - val_loss: 0.0231 - val_acc: 0.9938
Epoch 51/200
 - 1s - loss: 0.0242 - acc: 0.9934 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 52/200
 - 1s - loss: 0.0241 - acc: 0.9931 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 53/200
 - 1s - loss: 0.0245 - acc: 0.9932 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 54/200
 - 1s - loss: 0.0244 - acc: 0.9928 - val_loss: 0.0230 - val_acc: 0.9938
Epoch 55/200
 - 1s - loss: 0.0243 - acc: 0.9932 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 56/200
 - 1s - loss: 0.0241 - acc: 0.9933 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 57/200
 - 1s - loss: 0.0243 - acc: 0.9929 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 58/200
 - 1s - loss: 0.0240 - acc: 0.9935 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 59/200
 - 1s - loss: 0.0240 - acc: 0.9933 - val_loss: 0.0229 - val_acc: 0.9938
Epoch 60/200
 - 1s - loss: 0.0240 - acc: 0.9933 - val_loss: 0.0228 - val_acc: 0.9938
Epoch 61/200
 - 1s - loss: 0.0243 - acc: 0.9928 - val_loss: 0.0228 - val_acc: 0.9938
Epoch 62/200
 - 1s - loss: 0.0239 - acc: 0.9932 - val_loss: 0.0228 - val_acc: 0.9938
Epoch 63/200
 - 1s - loss: 0.0239 - acc: 0.9933 - val_loss: 0.0228 - val_acc: 0.9938
Epoch 64/200
 - 1s - loss: 0.0237 - acc: 0.9932 - val_loss: 0.0228 - val_acc: 0.9938
Epoch 65/200
 - 1s - loss: 0.0240 - acc: 0.9931 - val_loss: 0.0228 - val_acc: 0.9938
Epoch 66/200
 - 1s - loss: 0.0239 - acc: 0.9931 - val_loss: 0.0227 - val_acc: 0.9938
Epoch 67/200
 - 1s - loss: 0.0238 - acc: 0.9932 - val_loss: 0.0227 - val_acc: 0.9938
Epoch 68/200
 - 1s - loss: 0.0239 - acc: 0.9931 - val_loss: 0.0227 - val_acc: 0.9938
Epoch 69/200
 - 1s - loss: 0.0238 - acc: 0.9935 - val_loss: 0.0227 - val_acc: 0.9938
Epoch 70/200
 - 1s - loss: 0.0236 - acc: 0.9936 - val_loss: 0.0227 - val_acc: 0.9938
Epoch 71/200
 - 1s - loss: 0.0237 - acc: 0.9930 - val_loss: 0.0227 - val_acc: 0.9938
Epoch 72/200
 - 1s - loss: 0.0239 - acc: 0.9931 - val_loss: 0.0226 - val_acc: 0.9938
Epoch 73/200
 - 1s - loss: 0.0239 - acc: 0.9928 - val_loss: 0.0226 - val_acc: 0.9938
Epoch 74/200
 - 1s - loss: 0.0237 - acc: 0.9932 - val_loss: 0.0226 - val_acc: 0.9938
Epoch 75/200
 - 1s - loss: 0.0237 - acc: 0.9932 - val_loss: 0.0226 - val_acc: 0.9938
Epoch 76/200
 - 1s - loss: 0.0240 - acc: 0.9929 - val_loss: 0.0226 - val_acc: 0.9938
Epoch 77/200
 - 1s - loss: 0.0236 - acc: 0.9932 - val_loss: 0.0226 - val_acc: 0.9938
Epoch 78/200
 - 1s - loss: 0.0236 - acc: 0.9930 - val_loss: 0.0226 - val_acc: 0.9938
Epoch 79/200
 - 1s - loss: 0.0239 - acc: 0.9926 - val_loss: 0.0225 - val_acc: 0.9938
Epoch 80/200
 - 1s - loss: 0.0232 - acc: 0.9933 - val_loss: 0.0225 - val_acc: 0.9938
Epoch 81/200
 - 1s - loss: 0.0238 - acc: 0.9931 - val_loss: 0.0225 - val_acc: 0.9938
Epoch 82/200
 - 1s - loss: 0.0235 - acc: 0.9933 - val_loss: 0.0225 - val_acc: 0.9938
Epoch 83/200
 - 1s - loss: 0.0238 - acc: 0.9933 - val_loss: 0.0225 - val_acc: 0.9938
Epoch 84/200
 - 1s - loss: 0.0235 - acc: 0.9929 - val_loss: 0.0225 - val_acc: 0.9938
Epoch 85/200
 - 1s - loss: 0.0236 - acc: 0.9932 - val_loss: 0.0225 - val_acc: 0.9938
Epoch 86/200
 - 1s - loss: 0.0232 - acc: 0.9930 - val_loss: 0.0225 - val_acc: 0.9938
Epoch 87/200
 - 1s - loss: 0.0235 - acc: 0.9932 - val_loss: 0.0224 - val_acc: 0.9938
Epoch 88/200
 - 1s - loss: 0.0236 - acc: 0.9931 - val_loss: 0.0224 - val_acc: 0.9938
Epoch 89/200
 - 1s - loss: 0.0238 - acc: 0.9933 - val_loss: 0.0224 - val_acc: 0.9938
Epoch 90/200
 - 1s - loss: 0.0241 - acc: 0.9924 - val_loss: 0.0224 - val_acc: 0.9938
Epoch 91/200
 - 1s - loss: 0.0233 - acc: 0.9929 - val_loss: 0.0224 - val_acc: 0.9938
Epoch 92/200
 - 1s - loss: 0.0233 - acc: 0.9930 - val_loss: 0.0224 - val_acc: 0.9938
Epoch 93/200
 - 1s - loss: 0.0233 - acc: 0.9928 - val_loss: 0.0224 - val_acc: 0.9938
Epoch 94/200
 - 1s - loss: 0.0235 - acc: 0.9931 - val_loss: 0.0224 - val_acc: 0.9938
Epoch 95/200
 - 1s - loss: 0.0234 - acc: 0.9932 - val_loss: 0.0223 - val_acc: 0.9938
Epoch 96/200
 - 1s - loss: 0.0234 - acc: 0.9931 - val_loss: 0.0223 - val_acc: 0.9938
Epoch 97/200
 - 1s - loss: 0.0234 - acc: 0.9933 - val_loss: 0.0223 - val_acc: 0.9938
Epoch 98/200
 - 1s - loss: 0.0231 - acc: 0.9929 - val_loss: 0.0223 - val_acc: 0.9938
Epoch 99/200
 - 1s - loss: 0.0233 - acc: 0.9933 - val_loss: 0.0223 - val_acc: 0.9938
Epoch 100/200
 - 1s - loss: 0.0238 - acc: 0.9931 - val_loss: 0.0223 - val_acc: 0.9938
Epoch 101/200
 - 1s - loss: 0.0236 - acc: 0.9933 - val_loss: 0.0223 - val_acc: 0.9938
Epoch 102/200
 - 1s - loss: 0.0233 - acc: 0.9932 - val_loss: 0.0223 - val_acc: 0.9938
Epoch 103/200
 - 1s - loss: 0.0233 - acc: 0.9932 - val_loss: 0.0223 - val_acc: 0.9938
Epoch 104/200
 - 1s - loss: 0.0238 - acc: 0.9929 - val_loss: 0.0223 - val_acc: 0.9938
Epoch 105/200
 - 1s - loss: 0.0235 - acc: 0.9928 - val_loss: 0.0223 - val_acc: 0.9938
Epoch 106/200
 - 1s - loss: 0.0231 - acc: 0.9936 - val_loss: 0.0222 - val_acc: 0.9938
Epoch 107/200
 - 1s - loss: 0.0231 - acc: 0.9927 - val_loss: 0.0222 - val_acc: 0.9938
Epoch 108/200
 - 1s - loss: 0.0230 - acc: 0.9936 - val_loss: 0.0222 - val_acc: 0.9938
Epoch 109/200
 - 1s - loss: 0.0231 - acc: 0.9933 - val_loss: 0.0222 - val_acc: 0.9938
Epoch 110/200
 - 1s - loss: 0.0236 - acc: 0.9929 - val_loss: 0.0222 - val_acc: 0.9938
Epoch 111/200
 - 1s - loss: 0.0233 - acc: 0.9935 - val_loss: 0.0222 - val_acc: 0.9938
Epoch 112/200
 - 1s - loss: 0.0231 - acc: 0.9933 - val_loss: 0.0222 - val_acc: 0.9938
Epoch 113/200
 - 1s - loss: 0.0235 - acc: 0.9932 - val_loss: 0.0222 - val_acc: 0.9938
Epoch 114/200
 - 1s - loss: 0.0230 - acc: 0.9935 - val_loss: 0.0222 - val_acc: 0.9938
Epoch 115/200
 - 1s - loss: 0.0232 - acc: 0.9938 - val_loss: 0.0222 - val_acc: 0.9938
Epoch 116/200
 - 1s - loss: 0.0230 - acc: 0.9936 - val_loss: 0.0222 - val_acc: 0.9938
Epoch 117/200
 - 1s - loss: 0.0231 - acc: 0.9931 - val_loss: 0.0222 - val_acc: 0.9938
Epoch 118/200
 - 1s - loss: 0.0231 - acc: 0.9933 - val_loss: 0.0221 - val_acc: 0.9940
Epoch 119/200
 - 1s - loss: 0.0235 - acc: 0.9929 - val_loss: 0.0221 - val_acc: 0.9940
Epoch 120/200
 - 1s - loss: 0.0230 - acc: 0.9933 - val_loss: 0.0221 - val_acc: 0.9940
Epoch 121/200
 - 1s - loss: 0.0232 - acc: 0.9931 - val_loss: 0.0221 - val_acc: 0.9940
Epoch 122/200
 - 1s - loss: 0.0230 - acc: 0.9936 - val_loss: 0.0221 - val_acc: 0.9940
Epoch 123/200
 - 1s - loss: 0.0230 - acc: 0.9935 - val_loss: 0.0221 - val_acc: 0.9940
Epoch 124/200
 - 1s - loss: 0.0231 - acc: 0.9931 - val_loss: 0.0221 - val_acc: 0.9940
Epoch 125/200
 - 1s - loss: 0.0232 - acc: 0.9933 - val_loss: 0.0221 - val_acc: 0.9940
Epoch 126/200
 - 1s - loss: 0.0226 - acc: 0.9933 - val_loss: 0.0221 - val_acc: 0.9940
Epoch 127/200
 - 1s - loss: 0.0233 - acc: 0.9933 - val_loss: 0.0221 - val_acc: 0.9940
Epoch 128/200
 - 1s - loss: 0.0233 - acc: 0.9933 - val_loss: 0.0221 - val_acc: 0.9940
Epoch 129/200
 - 1s - loss: 0.0230 - acc: 0.9936 - val_loss: 0.0221 - val_acc: 0.9940
Epoch 130/200
 - 1s - loss: 0.0233 - acc: 0.9934 - val_loss: 0.0221 - val_acc: 0.9940
Epoch 131/200
 - 1s - loss: 0.0234 - acc: 0.9927 - val_loss: 0.0221 - val_acc: 0.9940
Epoch 132/200
 - 1s - loss: 0.0233 - acc: 0.9932 - val_loss: 0.0220 - val_acc: 0.9940
Epoch 133/200
 - 1s - loss: 0.0233 - acc: 0.9933 - val_loss: 0.0220 - val_acc: 0.9940
Epoch 134/200
 - 1s - loss: 0.0231 - acc: 0.9936 - val_loss: 0.0220 - val_acc: 0.9940
Epoch 135/200
 - 1s - loss: 0.0232 - acc: 0.9933 - val_loss: 0.0220 - val_acc: 0.9940
Epoch 136/200
 - 1s - loss: 0.0231 - acc: 0.9931 - val_loss: 0.0220 - val_acc: 0.9940
Epoch 137/200
 - 1s - loss: 0.0229 - acc: 0.9932 - val_loss: 0.0220 - val_acc: 0.9940
Epoch 138/200
 - 1s - loss: 0.0230 - acc: 0.9932 - val_loss: 0.0220 - val_acc: 0.9940
Epoch 139/200
 - 1s - loss: 0.0231 - acc: 0.9931 - val_loss: 0.0220 - val_acc: 0.9940
Epoch 140/200
 - 1s - loss: 0.0231 - acc: 0.9937 - val_loss: 0.0220 - val_acc: 0.9940
Epoch 141/200
 - 1s - loss: 0.0231 - acc: 0.9939 - val_loss: 0.0220 - val_acc: 0.9940
Epoch 142/200
 - 1s - loss: 0.0230 - acc: 0.9933 - val_loss: 0.0220 - val_acc: 0.9940
Epoch 143/200
 - 1s - loss: 0.0227 - acc: 0.9935 - val_loss: 0.0220 - val_acc: 0.9940
Epoch 144/200
 - 1s - loss: 0.0231 - acc: 0.9931 - val_loss: 0.0220 - val_acc: 0.9940
Epoch 145/200
 - 1s - loss: 0.0231 - acc: 0.9934 - val_loss: 0.0220 - val_acc: 0.9940
Epoch 146/200
 - 1s - loss: 0.0229 - acc: 0.9933 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 147/200
 - 1s - loss: 0.0229 - acc: 0.9929 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 148/200
 - 1s - loss: 0.0230 - acc: 0.9931 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 149/200
 - 1s - loss: 0.0233 - acc: 0.9930 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 150/200
 - 1s - loss: 0.0229 - acc: 0.9939 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 151/200
 - 1s - loss: 0.0231 - acc: 0.9934 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 152/200
 - 1s - loss: 0.0229 - acc: 0.9933 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 153/200
 - 1s - loss: 0.0233 - acc: 0.9931 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 154/200
 - 1s - loss: 0.0229 - acc: 0.9935 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 155/200
 - 1s - loss: 0.0225 - acc: 0.9935 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 156/200
 - 1s - loss: 0.0224 - acc: 0.9935 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 157/200
 - 1s - loss: 0.0228 - acc: 0.9932 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 158/200
 - 1s - loss: 0.0229 - acc: 0.9934 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 159/200
 - 1s - loss: 0.0229 - acc: 0.9932 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 160/200
 - 1s - loss: 0.0230 - acc: 0.9933 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 161/200
 - 1s - loss: 0.0232 - acc: 0.9936 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 162/200
 - 1s - loss: 0.0231 - acc: 0.9931 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 163/200
 - 1s - loss: 0.0232 - acc: 0.9936 - val_loss: 0.0219 - val_acc: 0.9940
Epoch 164/200
 - 1s - loss: 0.0230 - acc: 0.9933 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 165/200
 - 1s - loss: 0.0227 - acc: 0.9938 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 166/200
 - 1s - loss: 0.0231 - acc: 0.9931 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 167/200
 - 1s - loss: 0.0228 - acc: 0.9934 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 168/200
 - 1s - loss: 0.0227 - acc: 0.9935 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 169/200
 - 1s - loss: 0.0223 - acc: 0.9938 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 170/200
 - 1s - loss: 0.0227 - acc: 0.9936 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 171/200
 - 1s - loss: 0.0230 - acc: 0.9932 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 172/200
 - 1s - loss: 0.0227 - acc: 0.9936 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 173/200
 - 1s - loss: 0.0226 - acc: 0.9936 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 174/200
 - 1s - loss: 0.0228 - acc: 0.9935 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 175/200
 - 1s - loss: 0.0226 - acc: 0.9933 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 176/200
 - 1s - loss: 0.0229 - acc: 0.9937 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 177/200
 - 1s - loss: 0.0225 - acc: 0.9937 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 178/200
 - 1s - loss: 0.0229 - acc: 0.9933 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 179/200
 - 1s - loss: 0.0227 - acc: 0.9933 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 180/200
 - 1s - loss: 0.0229 - acc: 0.9927 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 181/200
 - 1s - loss: 0.0229 - acc: 0.9937 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 182/200
 - 1s - loss: 0.0227 - acc: 0.9934 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 183/200
 - 1s - loss: 0.0227 - acc: 0.9935 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 184/200
 - 1s - loss: 0.0227 - acc: 0.9932 - val_loss: 0.0218 - val_acc: 0.9940
Epoch 185/200
 - 1s - loss: 0.0224 - acc: 0.9938 - val_loss: 0.0217 - val_acc: 0.9940
Epoch 186/200
 - 1s - loss: 0.0225 - acc: 0.9931 - val_loss: 0.0217 - val_acc: 0.9940
Epoch 187/200
 - 1s - loss: 0.0229 - acc: 0.9933 - val_loss: 0.0217 - val_acc: 0.9940
Epoch 188/200
 - 1s - loss: 0.0226 - acc: 0.9935 - val_loss: 0.0217 - val_acc: 0.9940
Epoch 189/200
 - 1s - loss: 0.0225 - acc: 0.9933 - val_loss: 0.0217 - val_acc: 0.9940
Epoch 190/200
 - 1s - loss: 0.0230 - acc: 0.9937 - val_loss: 0.0217 - val_acc: 0.9940
Epoch 191/200
 - 1s - loss: 0.0226 - acc: 0.9936 - val_loss: 0.0217 - val_acc: 0.9940
Epoch 192/200
 - 1s - loss: 0.0228 - acc: 0.9934 - val_loss: 0.0217 - val_acc: 0.9940
Epoch 193/200
 - 1s - loss: 0.0224 - acc: 0.9937 - val_loss: 0.0217 - val_acc: 0.9940
Epoch 194/200
 - 1s - loss: 0.0227 - acc: 0.9932 - val_loss: 0.0217 - val_acc: 0.9940
Epoch 195/200
 - 1s - loss: 0.0225 - acc: 0.9932 - val_loss: 0.0217 - val_acc: 0.9940
Epoch 196/200
 - 1s - loss: 0.0226 - acc: 0.9937 - val_loss: 0.0217 - val_acc: 0.9940
Epoch 197/200
 - 1s - loss: 0.0226 - acc: 0.9933 - val_loss: 0.0217 - val_acc: 0.9940
Epoch 198/200
 - 1s - loss: 0.0228 - acc: 0.9933 - val_loss: 0.0217 - val_acc: 0.9940
Epoch 199/200
 - 1s - loss: 0.0227 - acc: 0.9936 - val_loss: 0.0217 - val_acc: 0.9940
Epoch 200/200
 - 1s - loss: 0.0228 - acc: 0.9935 - val_loss: 0.0217 - val_acc: 0.9940
2018-03-27 11:34:52,708 [INFO] Evaluate...
2018-03-27 11:34:56,491 [INFO] Done!
2018-03-27 11:34:56,498 [INFO] tpe_transform took 0.002456 seconds
2018-03-27 11:34:56,498 [INFO] TPE using 57/57 trials with best loss 0.011121
2018-03-27 11:34:56,507 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:34:57,498 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 6s - loss: 0.0413 - acc: 0.9856 - val_loss: 0.0214 - val_acc: 0.9926
Epoch 2/200
 - 1s - loss: 0.0223 - acc: 0.9928 - val_loss: 0.0197 - val_acc: 0.9928
Epoch 3/200
 - 1s - loss: 0.0205 - acc: 0.9934 - val_loss: 0.0193 - val_acc: 0.9928
... (epochs 4-198 omitted; val_loss gradually decreases from 0.0188 to 0.0161) ...
Epoch 199/200
 - 1s - loss: 0.0156 - acc: 0.9954 - val_loss: 0.0161 - val_acc: 0.9950
Epoch 200/200
 - 1s - loss: 0.0157 - acc: 0.9950 - val_loss: 0.0161 - val_acc: 0.9950
2018-03-27 11:38:11,459 [INFO] Evaluate...
2018-03-27 11:38:15,295 [INFO] Done!
2018-03-27 11:38:15,302 [INFO] tpe_transform took 0.002469 seconds
2018-03-27 11:38:15,302 [INFO] TPE using 58/58 trials with best loss 0.011121
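The "TPE using 58/58 trials with best loss 0.011121" lines come from a hyperopt-style hyperparameter search over the ensemble classifier. A minimal stand-in for that trial loop is sketched below; plain random sampling replaces hyperopt's adaptive TPE proposals, and the toy learning-rate objective is purely illustrative, not the notebook's actual search space.

```python
import random

def run_search(objective, space_sampler, n_trials, seed=0):
    """Mimic the trial loop in the log: evaluate n_trials candidate
    hyperparameter sets and track the best loss seen so far.
    (hyperopt's TPE proposes candidates adaptively; plain random
    sampling stands in for it here.)"""
    rng = random.Random(seed)
    best_loss, best_params = float("inf"), None
    for k in range(1, n_trials + 1):
        params = space_sampler(rng)
        loss = objective(params)
        if loss < best_loss:
            best_loss, best_params = loss, params
        # mirrors the "TPE using k/k trials with best loss ..." lines
        print(f"using {k}/{k} trials with best loss {best_loss:.6f}")
    return best_params, best_loss

# toy objective: quadratic bowl around lr = 0.01 (illustrative only)
sampler = lambda rng: {"lr": 10 ** rng.uniform(-4, -1)}
objective = lambda p: (p["lr"] - 0.01) ** 2
params, loss = run_search(objective, sampler, n_trials=60)
```

Each trial in the log retrains the blended model from scratch before the best-loss bookkeeping, which is why a full 200-epoch fit follows every TPE line.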
2018-03-27 11:38:15,310 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:38:16,296 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 6s - loss: 0.0567 - acc: 0.9800 - val_loss: 0.0277 - val_acc: 0.9930
Epoch 2/200
 - 1s - loss: 0.0253 - acc: 0.9917 - val_loss: 0.0252 - val_acc: 0.9930
Epoch 3/200
 - 1s - loss: 0.0233 - acc: 0.9931 - val_loss: 0.0243 - val_acc: 0.9932
 ... [epochs 4-197 omitted: train loss 0.0217 to ~0.0154, val_loss 0.0242 to 0.0202, val_acc 0.9930 to 0.9942]
Epoch 198/200
 - 1s - loss: 0.0157 - acc: 0.9951 - val_loss: 0.0202 - val_acc: 0.9942
Epoch 199/200
 - 1s - loss: 0.0152 - acc: 0.9955 - val_loss: 0.0202 - val_acc: 0.9944
Epoch 200/200
 - 1s - loss: 0.0154 - acc: 0.9954 - val_loss: 0.0202 - val_acc: 0.9944
2018-03-27 11:41:30,040 [INFO] Evaluate...
2018-03-27 11:41:33,884 [INFO] Done!
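Each trial's "load features from 「original_*.h5」" lines refer to bottleneck features pre-extracted with InceptionV3, Xception, and ResNet50 and concatenated before the small dense classifier is fit. A self-contained sketch of that merge step, using NumPy: the 2048-wide arrays are stand-ins shaped like the standard global-average-pooled outputs of those three backbones, and the random data replaces what the notebook would actually read from the .h5 files with h5py.

```python
import numpy as np

def stack_features(feature_sets):
    """Concatenate per-model bottleneck features along the feature axis,
    mirroring the three 'load features from original_*.h5' lines."""
    return np.concatenate(feature_sets, axis=1)

rng = np.random.default_rng(0)
n = 8  # a handful of samples for illustration
inception = rng.normal(size=(n, 2048))   # InceptionV3 pooled features
xception  = rng.normal(size=(n, 2048))   # Xception pooled features
resnet    = rng.normal(size=(n, 2048))   # ResNet50 pooled features

# merged matrix that would feed the small classifier head
x = stack_features([inception, xception, resnet])
```

With 20,000 training and 5,000 validation rows of such merged features, each epoch is cheap — hence the ~1 s epochs in the log.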
2018-03-27 11:41:33,892 [INFO] tpe_transform took 0.003273 seconds
2018-03-27 11:41:33,892 [INFO] TPE using 59/59 trials with best loss 0.011121
2018-03-27 11:41:33,901 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:41:34,888 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 6s - loss: 0.0394 - acc: 0.9869 - val_loss: 0.0223 - val_acc: 0.9940
Epoch 2/200
 - 1s - loss: 0.0192 - acc: 0.9932 - val_loss: 0.0206 - val_acc: 0.9948
Epoch 3/200
 - 1s - loss: 0.0175 - acc: 0.9940 - val_loss: 0.0195 - val_acc: 0.9942
 ... [epochs 4-53 omitted: train loss 0.0167 to 0.0109, val_loss 0.0193 to 0.0180, val_acc 0.9940-0.9948]
Epoch 54/200
 - 1s - loss: 0.0112 - acc: 0.9968 - val_loss: 0.0180 - val_acc: 0.9948
Epoch 55/200
 - 1s - loss: 0.0114 - acc: 0.9967 - val_loss: 0.0180 - val_acc: 0.9948
Epoch 56/200
 - 1s - loss: 0.0114 - acc: 0.9963 - val_loss: 0.0180 - val_acc: 0.9948
2018-03-27 11:42:39,837 [INFO] Evaluate...
2018-03-27 11:42:43,732 [INFO] Done!
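The trial just above halts at epoch 56 of a scheduled 200, consistent with an early-stopping callback monitoring val_loss. A minimal pure-Python version of the patience logic (the patience value of 3 is an assumption; Keras's `EarlyStopping` callback implements the same idea):

```python
def early_stop_epoch(val_losses, patience=5, min_delta=0.0):
    """Return the 1-based epoch at which training would stop: the first
    epoch after val_loss has failed to improve for `patience` epochs.
    Returns len(val_losses) if no stop is triggered."""
    best = float("inf")
    wait = 0
    for epoch, vl in enumerate(val_losses, start=1):
        if vl < best - min_delta:
            best, wait = vl, 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return len(val_losses)

# a plateauing val_loss curve like the early epochs of this trial
curve = [0.0223, 0.0206, 0.0195, 0.0193, 0.0191,
         0.0190, 0.0187, 0.0187, 0.0186, 0.0187,
         0.0184, 0.0185, 0.0184, 0.0186, 0.0184]
stop = early_stop_epoch(curve, patience=3)
```

With a small `min_delta`, the long runs above that plateau for dozens of epochs at the same val_loss would also terminate well before epoch 200.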
2018-03-27 11:42:43,738 [INFO] tpe_transform took 0.002483 seconds
2018-03-27 11:42:43,739 [INFO] TPE using 60/60 trials with best loss 0.011121
2018-03-27 11:42:43,747 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:42:44,731 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 6s - loss: 0.0895 - acc: 0.9691 - val_loss: 0.0438 - val_acc: 0.9880
Epoch 2/200
 - 1s - loss: 0.0384 - acc: 0.9897 - val_loss: 0.0390 - val_acc: 0.9898
Epoch 3/200
 - 1s - loss: 0.0354 - acc: 0.9898 - val_loss: 0.0368 - val_acc: 0.9902
 ... [epochs 4-107 omitted: train loss 0.0333 to 0.0242, val_loss 0.0355 to 0.0280, val_acc 0.9910 to 0.9930]
Epoch 108/200
 - 1s - loss: 0.0239 - acc: 0.9937 - val_loss: 0.0280 - val_acc: 0.9932
Epoch 109/200
 - 1s - loss: 0.0244 - acc: 0.9931 - val_loss: 0.0280 - val_acc: 0.9932
Epoch 110/200
 - 1s - loss: 0.0243 - acc: 0.9936 - val_loss: 0.0280 - val_acc: 0.9932
Epoch 111/200
 - 1s - loss: 0.0240 - acc: 0.9930 - val_loss: 0.0280 - val_acc: 0.9932
Epoch 112/200
 - 1s - loss: 0.0241 - acc: 0.9929 - val_loss: 0.0280 - val_acc: 0.9932
Epoch 113/200
 - 1s - loss: 0.0239 - acc: 0.9937 - val_loss: 0.0280 - val_acc: 0.9932
Epoch 114/200
 - 1s - loss: 0.0239 - acc: 0.9938 - val_loss: 0.0279 - val_acc: 0.9932
Epoch 115/200
 - 1s - loss: 0.0240 - acc: 0.9932 - val_loss: 0.0279 - val_acc: 0.9932
Epoch 116/200
 - 1s - loss: 0.0240 - acc: 0.9933 - val_loss: 0.0279 - val_acc: 0.9932
Epoch 117/200
 - 1s - loss: 0.0240 - acc: 0.9935 - val_loss: 0.0279 - val_acc: 0.9932
Epoch 118/200
 - 1s - loss: 0.0239 - acc: 0.9934 - val_loss: 0.0279 - val_acc: 0.9932
Epoch 119/200
 - 1s - loss: 0.0239 - acc: 0.9936 - val_loss: 0.0279 - val_acc: 0.9932
Epoch 120/200
 - 1s - loss: 0.0240 - acc: 0.9937 - val_loss: 0.0279 - val_acc: 0.9932
Epoch 121/200
 - 1s - loss: 0.0238 - acc: 0.9938 - val_loss: 0.0279 - val_acc: 0.9932
Epoch 122/200
 - 1s - loss: 0.0240 - acc: 0.9932 - val_loss: 0.0279 - val_acc: 0.9932
Epoch 123/200
 - 1s - loss: 0.0240 - acc: 0.9934 - val_loss: 0.0279 - val_acc: 0.9932
Epoch 124/200
 - 1s - loss: 0.0242 - acc: 0.9932 - val_loss: 0.0278 - val_acc: 0.9932
Epoch 125/200
 - 1s - loss: 0.0240 - acc: 0.9934 - val_loss: 0.0278 - val_acc: 0.9932
Epoch 126/200
 - 1s - loss: 0.0239 - acc: 0.9929 - val_loss: 0.0278 - val_acc: 0.9932
Epoch 127/200
 - 1s - loss: 0.0240 - acc: 0.9935 - val_loss: 0.0278 - val_acc: 0.9932
Epoch 128/200
 - 1s - loss: 0.0235 - acc: 0.9936 - val_loss: 0.0278 - val_acc: 0.9932
Epoch 129/200
 - 1s - loss: 0.0237 - acc: 0.9932 - val_loss: 0.0278 - val_acc: 0.9932
Epoch 130/200
 - 1s - loss: 0.0242 - acc: 0.9931 - val_loss: 0.0278 - val_acc: 0.9932
Epoch 131/200
 - 1s - loss: 0.0236 - acc: 0.9931 - val_loss: 0.0278 - val_acc: 0.9932
Epoch 132/200
 - 1s - loss: 0.0237 - acc: 0.9934 - val_loss: 0.0278 - val_acc: 0.9932
Epoch 133/200
 - 1s - loss: 0.0241 - acc: 0.9934 - val_loss: 0.0278 - val_acc: 0.9932
Epoch 134/200
 - 1s - loss: 0.0235 - acc: 0.9935 - val_loss: 0.0278 - val_acc: 0.9932
Epoch 135/200
 - 1s - loss: 0.0240 - acc: 0.9932 - val_loss: 0.0277 - val_acc: 0.9932
Epoch 136/200
 - 1s - loss: 0.0236 - acc: 0.9933 - val_loss: 0.0277 - val_acc: 0.9932
Epoch 137/200
 - 1s - loss: 0.0241 - acc: 0.9930 - val_loss: 0.0277 - val_acc: 0.9932
Epoch 138/200
 - 1s - loss: 0.0240 - acc: 0.9929 - val_loss: 0.0277 - val_acc: 0.9932
Epoch 139/200
 - 1s - loss: 0.0236 - acc: 0.9932 - val_loss: 0.0277 - val_acc: 0.9932
Epoch 140/200
 - 1s - loss: 0.0238 - acc: 0.9932 - val_loss: 0.0277 - val_acc: 0.9932
Epoch 141/200
 - 1s - loss: 0.0235 - acc: 0.9929 - val_loss: 0.0277 - val_acc: 0.9932
Epoch 142/200
 - 1s - loss: 0.0235 - acc: 0.9933 - val_loss: 0.0277 - val_acc: 0.9932
Epoch 143/200
 - 1s - loss: 0.0238 - acc: 0.9933 - val_loss: 0.0277 - val_acc: 0.9932
Epoch 144/200
 - 1s - loss: 0.0236 - acc: 0.9935 - val_loss: 0.0277 - val_acc: 0.9932
Epoch 145/200
 - 1s - loss: 0.0240 - acc: 0.9932 - val_loss: 0.0277 - val_acc: 0.9932
Epoch 146/200
 - 1s - loss: 0.0238 - acc: 0.9931 - val_loss: 0.0277 - val_acc: 0.9932
Epoch 147/200
 - 1s - loss: 0.0240 - acc: 0.9934 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 148/200
 - 1s - loss: 0.0234 - acc: 0.9937 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 149/200
 - 1s - loss: 0.0236 - acc: 0.9938 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 150/200
 - 1s - loss: 0.0234 - acc: 0.9935 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 151/200
 - 1s - loss: 0.0234 - acc: 0.9937 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 152/200
 - 1s - loss: 0.0241 - acc: 0.9933 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 153/200
 - 1s - loss: 0.0237 - acc: 0.9932 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 154/200
 - 1s - loss: 0.0234 - acc: 0.9937 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 155/200
 - 1s - loss: 0.0238 - acc: 0.9934 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 156/200
 - 1s - loss: 0.0236 - acc: 0.9938 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 157/200
 - 1s - loss: 0.0237 - acc: 0.9933 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 158/200
 - 1s - loss: 0.0234 - acc: 0.9935 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 159/200
 - 1s - loss: 0.0237 - acc: 0.9934 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 160/200
 - 1s - loss: 0.0241 - acc: 0.9936 - val_loss: 0.0276 - val_acc: 0.9932
Epoch 161/200
 - 1s - loss: 0.0236 - acc: 0.9934 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 162/200
 - 1s - loss: 0.0237 - acc: 0.9937 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 163/200
 - 1s - loss: 0.0237 - acc: 0.9935 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 164/200
 - 1s - loss: 0.0236 - acc: 0.9934 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 165/200
 - 1s - loss: 0.0234 - acc: 0.9936 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 166/200
 - 1s - loss: 0.0239 - acc: 0.9934 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 167/200
 - 1s - loss: 0.0237 - acc: 0.9932 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 168/200
 - 1s - loss: 0.0236 - acc: 0.9937 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 169/200
 - 1s - loss: 0.0235 - acc: 0.9935 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 170/200
 - 1s - loss: 0.0236 - acc: 0.9934 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 171/200
 - 1s - loss: 0.0233 - acc: 0.9936 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 172/200
 - 1s - loss: 0.0236 - acc: 0.9935 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 173/200
 - 1s - loss: 0.0235 - acc: 0.9934 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 174/200
 - 1s - loss: 0.0235 - acc: 0.9934 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 175/200
 - 1s - loss: 0.0233 - acc: 0.9933 - val_loss: 0.0275 - val_acc: 0.9932
Epoch 176/200
 - 1s - loss: 0.0232 - acc: 0.9934 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 177/200
 - 1s - loss: 0.0234 - acc: 0.9935 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 178/200
 - 1s - loss: 0.0237 - acc: 0.9937 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 179/200
 - 1s - loss: 0.0234 - acc: 0.9935 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 180/200
 - 1s - loss: 0.0233 - acc: 0.9934 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 181/200
 - 1s - loss: 0.0234 - acc: 0.9933 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 182/200
 - 1s - loss: 0.0237 - acc: 0.9937 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 183/200
 - 1s - loss: 0.0234 - acc: 0.9936 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 184/200
 - 1s - loss: 0.0235 - acc: 0.9934 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 185/200
 - 1s - loss: 0.0235 - acc: 0.9929 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 186/200
 - 1s - loss: 0.0236 - acc: 0.9935 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 187/200
 - 1s - loss: 0.0234 - acc: 0.9932 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 188/200
 - 1s - loss: 0.0234 - acc: 0.9935 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 189/200
 - 1s - loss: 0.0233 - acc: 0.9933 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 190/200
 - 1s - loss: 0.0232 - acc: 0.9932 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 191/200
 - 1s - loss: 0.0235 - acc: 0.9933 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 192/200
 - 1s - loss: 0.0233 - acc: 0.9935 - val_loss: 0.0274 - val_acc: 0.9932
Epoch 193/200
 - 1s - loss: 0.0236 - acc: 0.9935 - val_loss: 0.0273 - val_acc: 0.9932
Epoch 194/200
 - 1s - loss: 0.0232 - acc: 0.9931 - val_loss: 0.0273 - val_acc: 0.9932
Epoch 195/200
 - 1s - loss: 0.0231 - acc: 0.9935 - val_loss: 0.0273 - val_acc: 0.9932
Epoch 196/200
 - 1s - loss: 0.0231 - acc: 0.9933 - val_loss: 0.0273 - val_acc: 0.9932
Epoch 197/200
 - 1s - loss: 0.0237 - acc: 0.9935 - val_loss: 0.0273 - val_acc: 0.9932
Epoch 198/200
 - 1s - loss: 0.0233 - acc: 0.9937 - val_loss: 0.0273 - val_acc: 0.9932
Epoch 199/200
 - 1s - loss: 0.0233 - acc: 0.9932 - val_loss: 0.0273 - val_acc: 0.9932
Epoch 200/200
 - 1s - loss: 0.0230 - acc: 0.9940 - val_loss: 0.0273 - val_acc: 0.9932
2018-03-27 11:45:59,677 [INFO] Evaluate...
2018-03-27 11:46:03,682 [INFO] Done!
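The repeated "[INFO] TPE using N/N trials with best loss 0.011121" lines come from a hyperparameter search in which each trial trains the dense head once ("Fit... Evaluate...") and reports its validation loss back to the optimizer. The real run uses hyperopt's TPE; the sketch below is a minimal stand-in using plain random search so the trial loop and best-loss tracking are visible without the hyperopt dependency. The parameter names (`dropout`, `lr`) and the toy objective are assumptions, not the author's actual search space.

```python
import random

def objective(params):
    # Stand-in for one trial: a toy loss surface instead of actually
    # training and evaluating the dense head.
    return (params["dropout"] - 0.5) ** 2 + (params["lr"] - 1e-4) ** 2

random.seed(0)
trials = []
for _ in range(20):
    # Sample a candidate configuration (TPE would propose these adaptively).
    params = {
        "dropout": random.uniform(0.0, 0.9),
        "lr": random.choice([1e-3, 1e-4, 1e-5]),
    }
    trials.append((objective(params), params))
    best = min(t[0] for t in trials)
    # Mirrors the "TPE using N/N trials with best loss ..." log line.
    print(f"using {len(trials)}/{len(trials)} trials with best loss {best:.6f}")
```

Unlike random search, TPE models the densities of good and bad trials and samples where the ratio favors good ones, which is why the logged best loss tends to improve steadily across trials.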
2018-03-27 11:46:03,688 [INFO] tpe_transform took 0.002480 seconds
2018-03-27 11:46:03,689 [INFO] TPE using 61/61 trials with best loss 0.011121
2018-03-27 11:46:03,697 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:46:04,685 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 6s - loss: 0.0392 - acc: 0.9864 - val_loss: 0.0227 - val_acc: 0.9918
Epoch 2/200
 - 1s - loss: 0.0183 - acc: 0.9940 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 3/200
 - 1s - loss: 0.0157 - acc: 0.9948 - val_loss: 0.0193 - val_acc: 0.9940
Epoch 4/200
 - 1s - loss: 0.0151 - acc: 0.9950 - val_loss: 0.0191 - val_acc: 0.9934
Epoch 5/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0186 - val_acc: 0.9940
Epoch 6/200 ... Epoch 80/200 (75 near-identical epochs omitted: loss 0.0135 → 0.0085, val_loss 0.0184 → 0.0170, val_acc 0.9936 → 0.9944)
Epoch 81/200
 - 1s - loss: 0.0087 - acc: 0.9977 - val_loss: 0.0170 - val_acc: 0.9946
Epoch 82/200
 - 1s - loss: 0.0088 - acc: 0.9978 - val_loss: 0.0170 - val_acc: 0.9946
Epoch 83/200
 - 1s - loss: 0.0084 - acc: 0.9983 - val_loss: 0.0170 - val_acc: 0.9944
2018-03-27 11:47:34,713 [INFO] Evaluate...
2018-03-27 11:47:38,696 [INFO] Done!
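The "load features from 「original_InceptionV3.h5」/「original_Xception.h5」/「original_ResNet50.h5」" lines indicate that each trial trains only a small dense head on pre-extracted bottleneck features, which is why every epoch after the first takes about one second. A minimal sketch of the feature-merging step, assuming each backbone was globally average-pooled to a 2048-dim vector per image (the actual .h5 files and their exact shapes are not reproduced here):

```python
import numpy as np

# Hypothetical per-image pooled features from the three backbones; in the
# notebook these would be read from the original_*.h5 files instead.
feat_inception = np.random.rand(8, 2048)  # InceptionV3 GAP output
feat_xception  = np.random.rand(8, 2048)  # Xception GAP output
feat_resnet    = np.random.rand(8, 2048)  # ResNet50 GAP output

# Concatenate along the feature axis to form the input of the dense head.
X = np.concatenate([feat_inception, feat_xception, feat_resnet], axis=1)
print(X.shape)  # (8, 6144)
```

Because the expensive convolutional forward passes are done once up front, the 200-epoch search over the dense head stays cheap enough to run dozens of TPE trials.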
2018-03-27 11:47:38,703 [INFO] tpe_transform took 0.003277 seconds
2018-03-27 11:47:38,704 [INFO] TPE using 62/62 trials with best loss 0.011121
2018-03-27 11:47:38,711 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:47:39,699 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 7s - loss: 0.0434 - acc: 0.9842 - val_loss: 0.0201 - val_acc: 0.9936
Epoch 2/200
 - 1s - loss: 0.0240 - acc: 0.9917 - val_loss: 0.0189 - val_acc: 0.9940
Epoch 3/200
 - 1s - loss: 0.0225 - acc: 0.9923 - val_loss: 0.0183 - val_acc: 0.9936
Epoch 4/200
 - 1s - loss: 0.0219 - acc: 0.9925 - val_loss: 0.0180 - val_acc: 0.9938
Epoch 5/200
 - 1s - loss: 0.0220 - acc: 0.9922 - val_loss: 0.0178 - val_acc: 0.9936
Epoch 6/200 ... Epoch 187/200 (182 near-identical epochs omitted: loss 0.0212 → 0.0175, val_loss 0.0176 → 0.0157, val_acc 0.9938 → 0.9948)
Epoch 188/200
 - 1s - loss: 0.0178 - acc: 0.9944 - val_loss: 0.0157 - val_acc: 0.9948
Epoch 189/200
 - 1s - loss: 0.0181 - acc: 0.9945 - val_loss: 0.0157 - val_acc: 0.9948
Epoch 190/200
 - 1s - loss: 0.0175 - acc: 0.9946 - val_loss: 0.0157 - val_acc: 0.9948
Epoch 191/200
 - 1s - loss: 0.0179 - acc: 0.9942 - val_loss: 0.0157 - val_acc: 0.9948
Epoch 192/200
 - 1s - loss: 0.0181 - acc: 0.9940 - val_loss: 0.0157 - val_acc: 0.9948
Epoch 193/200
 - 1s - loss: 0.0177 - acc: 0.9944 - val_loss: 0.0157 - val_acc: 0.9948
Epoch 194/200
 - 1s - loss: 0.0174 - acc: 0.9942 - val_loss: 0.0157 - val_acc: 0.9948
Epoch 195/200
 - 1s - loss: 0.0180 - acc: 0.9946 - val_loss: 0.0157 - val_acc: 0.9948
Epoch 196/200
 - 1s - loss: 0.0176 - acc: 0.9944 - val_loss: 0.0157 - val_acc: 0.9948
Epoch 197/200
 - 1s - loss: 0.0175 - acc: 0.9946 - val_loss: 0.0157 - val_acc: 0.9948
Epoch 198/200
 - 1s - loss: 0.0174 - acc: 0.9945 - val_loss: 0.0157 - val_acc: 0.9948
Epoch 199/200
 - 1s - loss: 0.0179 - acc: 0.9944 - val_loss: 0.0157 - val_acc: 0.9948
Epoch 200/200
 - 1s - loss: 0.0179 - acc: 0.9944 - val_loss: 0.0157 - val_acc: 0.9948
2018-03-27 11:50:55,539 [INFO] Evaluate...
2018-03-27 11:50:59,628 [INFO] Done!
2018-03-27 11:50:59,634 [INFO] tpe_transform took 0.002508 seconds
2018-03-27 11:50:59,635 [INFO] TPE using 63/63 trials with best loss 0.011121
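Each "Load data... / model created / Fit..." block in this log is one hyperopt trial: the pre-extracted bottleneck features from InceptionV3, Xception and ResNet50 are concatenated and a single sigmoid unit is fit on top for 200 epochs. A NumPy-only stand-in for that top-classifier fit, with synthetic features in place of the `.h5` files (feature widths, learning rate and the synthetic labels are illustrative assumptions, not the notebook's actual code):

```python
import numpy as np

rng = np.random.RandomState(0)
n = 1000
# Synthetic stand-ins for the three bottleneck-feature sets that the log
# loads from original_InceptionV3.h5 / Xception / ResNet50 (widths assumed).
feats = [rng.randn(n, d) for d in (64, 64, 64)]
X = np.concatenate(feats, axis=1)
true_w = rng.randn(X.shape[1])
y = (X @ true_w > 0).astype(float)  # linearly separable synthetic labels

# One sigmoid unit trained by full-batch gradient descent, mirroring a
# Dense(1, activation='sigmoid') head on the merged features.
w, b, lr = np.zeros(X.shape[1]), 0.0, 0.1
for _ in range(200):  # 200 epochs, as in the log
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= lr * (X.T @ (p - y)) / n   # binary cross-entropy gradient
    b -= lr * float(np.mean(p - y))

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
acc = float(np.mean((p > 0.5) == (y == 1.0)))
```

The real notebook reads the features with h5py and trains with Keras; this sketch only shows why the per-epoch cost is so low ("1s" per epoch): the expensive convolutional networks run once up front, and each trial retrains only this tiny head.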
2018-03-27 11:50:59,643 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:51:00,642 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 7s - loss: 0.0656 - acc: 0.9786 - val_loss: 0.0390 - val_acc: 0.9886
Epoch 2/200
 - 1s - loss: 0.0311 - acc: 0.9910 - val_loss: 0.0367 - val_acc: 0.9890
Epoch 3/200
 - 1s - loss: 0.0284 - acc: 0.9920 - val_loss: 0.0352 - val_acc: 0.9890
[... epochs 4-199 omitted: loss decreases from 0.0276 to ≈ 0.021, val_loss from 0.0345 to 0.0291, val_acc climbs to 0.9916 ...]
Epoch 200/200
 - 1s - loss: 0.0211 - acc: 0.9937 - val_loss: 0.0291 - val_acc: 0.9916
2018-03-27 11:54:17,100 [INFO] Evaluate...
2018-03-27 11:54:21,253 [INFO] Done!
2018-03-27 11:54:21,260 [INFO] tpe_transform took 0.002521 seconds
2018-03-27 11:54:21,261 [INFO] TPE using 64/64 trials with best loss 0.011121
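The "tpe_transform took ... / TPE using N/N trials with best loss 0.011121" lines come from hyperopt's TPE optimizer driving these repeated fits: each trial samples hyperparameters, trains the head, and reports the validation loss back to the optimizer. A dependency-free stand-in for that outer loop, using plain random search in place of TPE (the search space and the loss surface are purely hypothetical):

```python
import random

random.seed(0)

def objective(params):
    # Hypothetical validation-loss surface standing in for a full
    # train-and-evaluate run of the top classifier.
    lr, dropout = params["lr"], params["dropout"]
    return 0.011 + 1e4 * (lr - 0.001) ** 2 + 0.1 * (dropout - 0.5) ** 2

best_loss, best_params = float("inf"), None
for trial in range(64):  # the log shows 64 completed trials
    params = {"lr": random.uniform(1e-4, 1e-2),
              "dropout": random.uniform(0.0, 0.9)}
    loss = objective(params)
    if loss < best_loss:
        best_loss, best_params = loss, params
```

With hyperopt one would instead pass `objective` and an `hp`-defined space to `fmin(..., algo=tpe.suggest)`, which biases later trials toward promising regions rather than sampling uniformly.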
2018-03-27 11:54:21,269 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:54:22,267 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 7s - loss: 0.0577 - acc: 0.9801 - val_loss: 0.0316 - val_acc: 0.9896
Epoch 2/200
 - 1s - loss: 0.0271 - acc: 0.9922 - val_loss: 0.0281 - val_acc: 0.9904
Epoch 3/200
 - 1s - loss: 0.0243 - acc: 0.9936 - val_loss: 0.0266 - val_acc: 0.9910
[... intermediate epochs omitted: val_loss declines steadily from 0.0265 to 0.0215, val_acc reaches 0.9920 ...]
Epoch 152/200
 - 1s - loss: 0.0156 - acc: 0.9953 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 153/200
 - 1s - loss: 0.0159 - acc: 0.9950 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 154/200
 - 1s - loss: 0.0160 - acc: 0.9948 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 155/200
 - 1s - loss: 0.0160 - acc: 0.9949 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 156/200
 - 1s - loss: 0.0160 - acc: 0.9951 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 157/200
 - 1s - loss: 0.0160 - acc: 0.9954 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 158/200
 - 1s - loss: 0.0162 - acc: 0.9949 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 159/200
 - 1s - loss: 0.0157 - acc: 0.9950 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 160/200
 - 1s - loss: 0.0158 - acc: 0.9954 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 161/200
 - 1s - loss: 0.0159 - acc: 0.9956 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 162/200
 - 1s - loss: 0.0157 - acc: 0.9956 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 163/200
 - 1s - loss: 0.0158 - acc: 0.9950 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 164/200
 - 1s - loss: 0.0160 - acc: 0.9951 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 165/200
 - 1s - loss: 0.0153 - acc: 0.9954 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 166/200
 - 1s - loss: 0.0163 - acc: 0.9946 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 167/200
 - 1s - loss: 0.0159 - acc: 0.9953 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 168/200
 - 1s - loss: 0.0162 - acc: 0.9953 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 169/200
 - 1s - loss: 0.0156 - acc: 0.9955 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 170/200
 - 1s - loss: 0.0157 - acc: 0.9956 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 171/200
 - 1s - loss: 0.0159 - acc: 0.9949 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 172/200
 - 1s - loss: 0.0160 - acc: 0.9953 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 173/200
 - 1s - loss: 0.0157 - acc: 0.9953 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 174/200
 - 1s - loss: 0.0161 - acc: 0.9947 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 175/200
 - 1s - loss: 0.0154 - acc: 0.9957 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 176/200
 - 1s - loss: 0.0160 - acc: 0.9953 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 177/200
 - 1s - loss: 0.0158 - acc: 0.9951 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 178/200
 - 1s - loss: 0.0158 - acc: 0.9950 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 179/200
 - 1s - loss: 0.0157 - acc: 0.9956 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 180/200
 - 1s - loss: 0.0153 - acc: 0.9956 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 181/200
 - 1s - loss: 0.0157 - acc: 0.9951 - val_loss: 0.0215 - val_acc: 0.9920
Epoch 182/200
 - 1s - loss: 0.0158 - acc: 0.9951 - val_loss: 0.0215 - val_acc: 0.9922
Epoch 183/200
 - 1s - loss: 0.0164 - acc: 0.9950 - val_loss: 0.0215 - val_acc: 0.9922
Epoch 184/200
 - 1s - loss: 0.0156 - acc: 0.9953 - val_loss: 0.0214 - val_acc: 0.9920
Epoch 185/200
 - 1s - loss: 0.0161 - acc: 0.9950 - val_loss: 0.0214 - val_acc: 0.9920
Epoch 186/200
 - 1s - loss: 0.0157 - acc: 0.9953 - val_loss: 0.0214 - val_acc: 0.9920
Epoch 187/200
 - 1s - loss: 0.0159 - acc: 0.9950 - val_loss: 0.0214 - val_acc: 0.9922
Epoch 188/200
 - 1s - loss: 0.0160 - acc: 0.9951 - val_loss: 0.0214 - val_acc: 0.9920
Epoch 189/200
 - 1s - loss: 0.0159 - acc: 0.9950 - val_loss: 0.0214 - val_acc: 0.9920
Epoch 190/200
 - 1s - loss: 0.0158 - acc: 0.9953 - val_loss: 0.0214 - val_acc: 0.9922
Epoch 191/200
 - 1s - loss: 0.0158 - acc: 0.9950 - val_loss: 0.0214 - val_acc: 0.9922
Epoch 192/200
 - 1s - loss: 0.0160 - acc: 0.9950 - val_loss: 0.0214 - val_acc: 0.9922
Epoch 193/200
 - 1s - loss: 0.0155 - acc: 0.9950 - val_loss: 0.0214 - val_acc: 0.9922
Epoch 194/200
 - 1s - loss: 0.0157 - acc: 0.9950 - val_loss: 0.0214 - val_acc: 0.9922
Epoch 195/200
 - 1s - loss: 0.0156 - acc: 0.9954 - val_loss: 0.0214 - val_acc: 0.9922
Epoch 196/200
 - 1s - loss: 0.0159 - acc: 0.9952 - val_loss: 0.0214 - val_acc: 0.9922
Epoch 197/200
 - 1s - loss: 0.0153 - acc: 0.9955 - val_loss: 0.0214 - val_acc: 0.9922
Epoch 198/200
 - 1s - loss: 0.0155 - acc: 0.9954 - val_loss: 0.0214 - val_acc: 0.9922
Epoch 199/200
 - 1s - loss: 0.0158 - acc: 0.9950 - val_loss: 0.0214 - val_acc: 0.9922
Epoch 200/200
 - 1s - loss: 0.0157 - acc: 0.9949 - val_loss: 0.0214 - val_acc: 0.9922
2018-03-27 11:57:38,721 [INFO] Evaluate...
2018-03-27 11:57:42,882 [INFO] Done!
2018-03-27 11:57:42,889 [INFO] tpe_transform took 0.003226 seconds
2018-03-27 11:57:42,890 [INFO] TPE using 65/65 trials with best loss 0.011121
2018-03-27 11:57:42,897 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 11:57:43,887 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 7s - loss: 0.0957 - acc: 0.9695 - val_loss: 0.0356 - val_acc: 0.9900
Epoch 2/200
 - 1s - loss: 0.0391 - acc: 0.9897 - val_loss: 0.0289 - val_acc: 0.9908
... (epochs 3-185 omitted; val_loss falls steadily from 0.0264 to 0.0173, val_acc settles around 0.9934) ...
Epoch 186/200
 - 1s - loss: 0.0171 - acc: 0.9947 - val_loss: 0.0173 - val_acc: 0.9934
2018-03-27 12:00:47,849 [INFO] Evaluate...
2018-03-27 12:00:52,045 [INFO] Done!
2018-03-27 12:00:52,052 [INFO] tpe_transform took 0.002474 seconds
2018-03-27 12:00:52,052 [INFO] TPE using 66/66 trials with best loss 0.011121
2018-03-27 12:00:52,060 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:00:53,048 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 7s - loss: 0.0810 - acc: 0.9770 - val_loss: 0.0330 - val_acc: 0.9924
Epoch 2/200
 - 1s - loss: 0.0356 - acc: 0.9897 - val_loss: 0.0269 - val_acc: 0.9928
... (epochs 3-73 omitted; val_loss falls from 0.0241 to 0.0181, val_acc settles around 0.9948) ...
Epoch 74/200
 - 1s - loss: 0.0145 - acc: 0.9958 - val_loss: 0.0181 - val_acc: 0.9948
2018-03-27 12:02:16,410 [INFO] Evaluate...
2018-03-27 12:02:20,626 [INFO] Done!
2018-03-27 12:02:20,632 [INFO] tpe_transform took 0.002565 seconds
2018-03-27 12:02:20,633 [INFO] TPE using 67/67 trials with best loss 0.011121
2018-03-27 12:02:20,641 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:02:21,631 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 7s - loss: 0.0478 - acc: 0.9813 - val_loss: 0.0153 - val_acc: 0.9948
Epoch 2/200
 - 1s - loss: 0.0257 - acc: 0.9915 - val_loss: 0.0147 - val_acc: 0.9948
... (epochs 3-46 omitted; val_loss falls from 0.0139 to 0.0123, val_acc settles around 0.9948) ...
Epoch 47/200
 - 1s - loss: 0.0166 - acc: 0.9952 - val_loss: 0.0123 - val_acc: 0.9948
2018-03-27 12:03:20,754 [INFO] Evaluate...
2018-03-27 12:03:25,055 [INFO] Done!
2018-03-27 12:03:25,062 [INFO] tpe_transform took 0.003297 seconds
2018-03-27 12:03:25,063 [INFO] TPE using 68/68 trials with best loss 0.011121
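The repeated `TPE using N/N trials with best loss 0.011121` lines indicate the search keeps a running best across all completed trials. A minimal sketch of that bookkeeping (hypothetical — hyperopt's `fmin`/`Trials` handles this internally; this is not the notebook's actual search code):

```python
def best_loss(trial_losses):
    """Return (num_trials, best_loss) over all completed trials.

    Stand-in for the running-best tracking that hyperopt's Trials
    object performs; trial_losses are the objective values returned
    by each trial's training run.
    """
    return len(trial_losses), min(trial_losses)


# Illustrative values only, not read from the actual trial history.
losses = [0.0139, 0.011121, 0.0152]
n, best = best_loss(losses)
print(f"TPE using {n}/{n} trials with best loss {best:.6f}")
```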
2018-03-27 12:03:25,071 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
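Before each `Fit`, the logs load pre-exported bottleneck features from InceptionV3, Xception, and ResNet50. Presumably these are concatenated along the feature axis and fed to a small dense classifier; a sketch of that merge step (the shapes below are illustrative assumptions, not values read from the `.h5` files):

```python
import numpy as np

# Stand-ins for the three exported feature files; each row is one image's
# bottleneck vector. 2048 is a typical pooled feature width, assumed here.
inception = np.random.rand(4, 2048)   # stand-in for original_InceptionV3.h5
xception  = np.random.rand(4, 2048)   # stand-in for original_Xception.h5
resnet    = np.random.rand(4, 2048)   # stand-in for original_ResNet50.h5

# Concatenate per-image features from all three backbones into one vector.
features = np.concatenate([inception, xception, resnet], axis=1)
print(features.shape)  # (4, 6144)
```

The combined matrix is then what a small `Dense` head would train on, which is consistent with the 1-second epochs in the logs.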
2018-03-27 12:03:26,073 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 7s - loss: 0.0455 - acc: 0.9830 - val_loss: 0.0191 - val_acc: 0.9946
Epoch 2/200
 - 1s - loss: 0.0206 - acc: 0.9929 - val_loss: 0.0178 - val_acc: 0.9946
Epoch 3/200
 - 1s - loss: 0.0191 - acc: 0.9937 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 4/200
 - 1s - loss: 0.0179 - acc: 0.9942 - val_loss: 0.0172 - val_acc: 0.9942
Epoch 5/200
 - 1s - loss: 0.0182 - acc: 0.9943 - val_loss: 0.0168 - val_acc: 0.9938
Epoch 6/200
 - 1s - loss: 0.0169 - acc: 0.9945 - val_loss: 0.0167 - val_acc: 0.9936
Epoch 7/200
 - 1s - loss: 0.0168 - acc: 0.9945 - val_loss: 0.0165 - val_acc: 0.9938
Epoch 8/200
 - 1s - loss: 0.0166 - acc: 0.9947 - val_loss: 0.0164 - val_acc: 0.9938
Epoch 9/200
 - 1s - loss: 0.0166 - acc: 0.9946 - val_loss: 0.0163 - val_acc: 0.9936
Epoch 10/200
 - 1s - loss: 0.0169 - acc: 0.9943 - val_loss: 0.0163 - val_acc: 0.9938
Epoch 11/200
 - 1s - loss: 0.0157 - acc: 0.9949 - val_loss: 0.0162 - val_acc: 0.9940
Epoch 12/200
 - 1s - loss: 0.0154 - acc: 0.9954 - val_loss: 0.0162 - val_acc: 0.9938
Epoch 13/200
 - 1s - loss: 0.0158 - acc: 0.9954 - val_loss: 0.0161 - val_acc: 0.9940
Epoch 14/200
 - 1s - loss: 0.0153 - acc: 0.9952 - val_loss: 0.0160 - val_acc: 0.9940
Epoch 15/200
 - 1s - loss: 0.0155 - acc: 0.9950 - val_loss: 0.0160 - val_acc: 0.9940
Epoch 16/200
 - 1s - loss: 0.0155 - acc: 0.9953 - val_loss: 0.0160 - val_acc: 0.9940
Epoch 17/200
 - 1s - loss: 0.0148 - acc: 0.9954 - val_loss: 0.0159 - val_acc: 0.9940
Epoch 18/200
 - 1s - loss: 0.0150 - acc: 0.9953 - val_loss: 0.0159 - val_acc: 0.9940
Epoch 19/200
 - 1s - loss: 0.0145 - acc: 0.9956 - val_loss: 0.0159 - val_acc: 0.9940
Epoch 20/200
 - 1s - loss: 0.0147 - acc: 0.9950 - val_loss: 0.0159 - val_acc: 0.9940
Epoch 21/200
 - 1s - loss: 0.0150 - acc: 0.9955 - val_loss: 0.0158 - val_acc: 0.9940
Epoch 22/200
 - 1s - loss: 0.0151 - acc: 0.9955 - val_loss: 0.0158 - val_acc: 0.9940
Epoch 23/200
 - 1s - loss: 0.0147 - acc: 0.9951 - val_loss: 0.0158 - val_acc: 0.9938
Epoch 24/200
 - 1s - loss: 0.0143 - acc: 0.9956 - val_loss: 0.0158 - val_acc: 0.9940
Epoch 25/200
 - 1s - loss: 0.0148 - acc: 0.9955 - val_loss: 0.0158 - val_acc: 0.9940
Epoch 26/200
 - 1s - loss: 0.0145 - acc: 0.9955 - val_loss: 0.0158 - val_acc: 0.9938
Epoch 27/200
 - 1s - loss: 0.0149 - acc: 0.9955 - val_loss: 0.0157 - val_acc: 0.9940
Epoch 28/200
 - 1s - loss: 0.0148 - acc: 0.9950 - val_loss: 0.0157 - val_acc: 0.9938
Epoch 29/200
 - 1s - loss: 0.0143 - acc: 0.9954 - val_loss: 0.0157 - val_acc: 0.9938
Epoch 30/200
 - 1s - loss: 0.0146 - acc: 0.9956 - val_loss: 0.0157 - val_acc: 0.9938
Epoch 31/200
 - 1s - loss: 0.0143 - acc: 0.9959 - val_loss: 0.0157 - val_acc: 0.9938
Epoch 32/200
 - 1s - loss: 0.0148 - acc: 0.9950 - val_loss: 0.0157 - val_acc: 0.9938
Epoch 33/200
 - 1s - loss: 0.0140 - acc: 0.9957 - val_loss: 0.0157 - val_acc: 0.9938
Epoch 34/200
 - 1s - loss: 0.0146 - acc: 0.9959 - val_loss: 0.0156 - val_acc: 0.9938
Epoch 35/200
 - 1s - loss: 0.0141 - acc: 0.9958 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 36/200
 - 1s - loss: 0.0147 - acc: 0.9955 - val_loss: 0.0156 - val_acc: 0.9938
Epoch 37/200
 - 1s - loss: 0.0147 - acc: 0.9953 - val_loss: 0.0156 - val_acc: 0.9938
Epoch 38/200
 - 1s - loss: 0.0144 - acc: 0.9955 - val_loss: 0.0156 - val_acc: 0.9938
Epoch 39/200
 - 1s - loss: 0.0148 - acc: 0.9955 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 40/200
 - 1s - loss: 0.0144 - acc: 0.9955 - val_loss: 0.0156 - val_acc: 0.9938
Epoch 41/200
 - 1s - loss: 0.0139 - acc: 0.9955 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 42/200
 - 1s - loss: 0.0139 - acc: 0.9960 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 43/200
 - 1s - loss: 0.0140 - acc: 0.9955 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 44/200
 - 1s - loss: 0.0142 - acc: 0.9956 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 45/200
 - 1s - loss: 0.0133 - acc: 0.9959 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 46/200
 - 1s - loss: 0.0141 - acc: 0.9956 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 47/200
 - 1s - loss: 0.0144 - acc: 0.9956 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 48/200
 - 1s - loss: 0.0141 - acc: 0.9960 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 49/200
 - 1s - loss: 0.0133 - acc: 0.9958 - val_loss: 0.0155 - val_acc: 0.9938
Epoch 50/200
 - 1s - loss: 0.0136 - acc: 0.9960 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 51/200
 - 1s - loss: 0.0138 - acc: 0.9955 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 52/200
 - 1s - loss: 0.0144 - acc: 0.9955 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 53/200
 - 1s - loss: 0.0132 - acc: 0.9956 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 54/200
 - 1s - loss: 0.0131 - acc: 0.9960 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 55/200
 - 1s - loss: 0.0143 - acc: 0.9956 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 56/200
 - 1s - loss: 0.0133 - acc: 0.9961 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 57/200
 - 1s - loss: 0.0140 - acc: 0.9959 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 58/200
 - 1s - loss: 0.0138 - acc: 0.9956 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 59/200
 - 1s - loss: 0.0144 - acc: 0.9953 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 60/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 61/200
 - 1s - loss: 0.0135 - acc: 0.9958 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 62/200
 - 1s - loss: 0.0139 - acc: 0.9956 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 63/200
 - 1s - loss: 0.0136 - acc: 0.9955 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 64/200
 - 1s - loss: 0.0133 - acc: 0.9960 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 65/200
 - 1s - loss: 0.0133 - acc: 0.9962 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 66/200
 - 1s - loss: 0.0135 - acc: 0.9960 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 67/200
 - 1s - loss: 0.0134 - acc: 0.9959 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 68/200
 - 1s - loss: 0.0131 - acc: 0.9962 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 69/200
 - 1s - loss: 0.0129 - acc: 0.9967 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 70/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 71/200
 - 1s - loss: 0.0128 - acc: 0.9963 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 72/200
 - 1s - loss: 0.0136 - acc: 0.9954 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 73/200
 - 1s - loss: 0.0139 - acc: 0.9959 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 74/200
 - 1s - loss: 0.0140 - acc: 0.9959 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 75/200
 - 1s - loss: 0.0137 - acc: 0.9956 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 76/200
 - 1s - loss: 0.0138 - acc: 0.9958 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 77/200
 - 1s - loss: 0.0133 - acc: 0.9958 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 78/200
 - 1s - loss: 0.0136 - acc: 0.9955 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 79/200
 - 1s - loss: 0.0133 - acc: 0.9958 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 80/200
 - 1s - loss: 0.0136 - acc: 0.9960 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 81/200
 - 1s - loss: 0.0134 - acc: 0.9955 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 82/200
 - 1s - loss: 0.0142 - acc: 0.9954 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 83/200
 - 1s - loss: 0.0135 - acc: 0.9960 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 84/200
 - 1s - loss: 0.0136 - acc: 0.9959 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 85/200
 - 1s - loss: 0.0134 - acc: 0.9957 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 86/200
 - 1s - loss: 0.0135 - acc: 0.9956 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 87/200
 - 1s - loss: 0.0134 - acc: 0.9959 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 88/200
 - 1s - loss: 0.0131 - acc: 0.9960 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 89/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 90/200
 - 1s - loss: 0.0135 - acc: 0.9957 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 91/200
 - 1s - loss: 0.0129 - acc: 0.9963 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 92/200
 - 1s - loss: 0.0136 - acc: 0.9960 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 93/200
 - 1s - loss: 0.0133 - acc: 0.9960 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 94/200
 - 1s - loss: 0.0137 - acc: 0.9956 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 95/200
 - 1s - loss: 0.0131 - acc: 0.9963 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 96/200
 - 1s - loss: 0.0136 - acc: 0.9958 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 97/200
 - 1s - loss: 0.0132 - acc: 0.9960 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 98/200
 - 1s - loss: 0.0136 - acc: 0.9960 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 99/200
 - 1s - loss: 0.0137 - acc: 0.9957 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 100/200
 - 1s - loss: 0.0130 - acc: 0.9962 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 101/200
 - 1s - loss: 0.0132 - acc: 0.9961 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 102/200
 - 1s - loss: 0.0136 - acc: 0.9958 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 103/200
 - 1s - loss: 0.0135 - acc: 0.9961 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 104/200
 - 1s - loss: 0.0131 - acc: 0.9963 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 105/200
 - 1s - loss: 0.0132 - acc: 0.9960 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 106/200
 - 1s - loss: 0.0134 - acc: 0.9960 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 107/200
 - 1s - loss: 0.0131 - acc: 0.9961 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 108/200
 - 1s - loss: 0.0132 - acc: 0.9959 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 109/200
 - 1s - loss: 0.0127 - acc: 0.9960 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 110/200
 - 1s - loss: 0.0132 - acc: 0.9959 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 111/200
 - 1s - loss: 0.0126 - acc: 0.9963 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 112/200
 - 1s - loss: 0.0136 - acc: 0.9964 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 113/200
 - 1s - loss: 0.0131 - acc: 0.9957 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 114/200
 - 1s - loss: 0.0126 - acc: 0.9961 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 115/200
 - 1s - loss: 0.0131 - acc: 0.9961 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 116/200
 - 1s - loss: 0.0129 - acc: 0.9961 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 117/200
 - 1s - loss: 0.0132 - acc: 0.9960 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 118/200
 - 1s - loss: 0.0127 - acc: 0.9964 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 119/200
 - 1s - loss: 0.0126 - acc: 0.9960 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 120/200
 - 1s - loss: 0.0131 - acc: 0.9958 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 121/200
 - 1s - loss: 0.0133 - acc: 0.9955 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 122/200
 - 1s - loss: 0.0132 - acc: 0.9955 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 123/200
 - 1s - loss: 0.0129 - acc: 0.9961 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 124/200
 - 1s - loss: 0.0133 - acc: 0.9964 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 125/200
 - 1s - loss: 0.0127 - acc: 0.9961 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 126/200
 - 1s - loss: 0.0134 - acc: 0.9958 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 127/200
 - 1s - loss: 0.0129 - acc: 0.9958 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 128/200
 - 1s - loss: 0.0125 - acc: 0.9963 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 129/200
 - 1s - loss: 0.0128 - acc: 0.9959 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 130/200
 - 1s - loss: 0.0132 - acc: 0.9962 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 131/200
 - 1s - loss: 0.0130 - acc: 0.9963 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 132/200
 - 1s - loss: 0.0132 - acc: 0.9958 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 133/200
 - 1s - loss: 0.0128 - acc: 0.9961 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 134/200
 - 1s - loss: 0.0128 - acc: 0.9959 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 135/200
 - 1s - loss: 0.0128 - acc: 0.9962 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 136/200
 - 1s - loss: 0.0135 - acc: 0.9956 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 137/200
 - 1s - loss: 0.0129 - acc: 0.9955 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 138/200
 - 1s - loss: 0.0131 - acc: 0.9963 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 139/200
 - 1s - loss: 0.0134 - acc: 0.9959 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 140/200
 - 1s - loss: 0.0136 - acc: 0.9961 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 141/200
 - 1s - loss: 0.0125 - acc: 0.9962 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 142/200
 - 1s - loss: 0.0136 - acc: 0.9959 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 143/200
 - 1s - loss: 0.0127 - acc: 0.9960 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 144/200
 - 1s - loss: 0.0128 - acc: 0.9964 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 145/200
 - 1s - loss: 0.0134 - acc: 0.9960 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 146/200
 - 1s - loss: 0.0122 - acc: 0.9963 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 147/200
 - 1s - loss: 0.0137 - acc: 0.9957 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 148/200
 - 1s - loss: 0.0127 - acc: 0.9964 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 149/200
 - 1s - loss: 0.0126 - acc: 0.9963 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 150/200
 - 1s - loss: 0.0131 - acc: 0.9960 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 151/200
 - 1s - loss: 0.0137 - acc: 0.9958 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 152/200
 - 1s - loss: 0.0131 - acc: 0.9960 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 153/200
 - 1s - loss: 0.0133 - acc: 0.9956 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 154/200
 - 1s - loss: 0.0122 - acc: 0.9967 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 155/200
 - 1s - loss: 0.0129 - acc: 0.9960 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 156/200
 - 1s - loss: 0.0131 - acc: 0.9963 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 157/200
 - 1s - loss: 0.0128 - acc: 0.9960 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 158/200
 - 1s - loss: 0.0130 - acc: 0.9961 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 159/200
 - 1s - loss: 0.0128 - acc: 0.9964 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 160/200
 - 1s - loss: 0.0130 - acc: 0.9956 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 161/200
 - 1s - loss: 0.0125 - acc: 0.9965 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 162/200
 - 1s - loss: 0.0125 - acc: 0.9963 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 163/200
 - 1s - loss: 0.0123 - acc: 0.9961 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 164/200
 - 1s - loss: 0.0129 - acc: 0.9961 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 165/200
 - 1s - loss: 0.0130 - acc: 0.9965 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 166/200
 - 1s - loss: 0.0125 - acc: 0.9963 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 167/200
 - 1s - loss: 0.0128 - acc: 0.9960 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 168/200
 - 1s - loss: 0.0128 - acc: 0.9959 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 169/200
 - 1s - loss: 0.0127 - acc: 0.9959 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 170/200
 - 1s - loss: 0.0128 - acc: 0.9960 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 171/200
 - 1s - loss: 0.0133 - acc: 0.9956 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 172/200
 - 1s - loss: 0.0130 - acc: 0.9963 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 173/200
 - 1s - loss: 0.0133 - acc: 0.9959 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 174/200
 - 1s - loss: 0.0123 - acc: 0.9964 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 175/200
 - 1s - loss: 0.0124 - acc: 0.9963 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 176/200
 - 1s - loss: 0.0128 - acc: 0.9963 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 177/200
 - 1s - loss: 0.0125 - acc: 0.9966 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 178/200
 - 1s - loss: 0.0131 - acc: 0.9960 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 179/200
 - 1s - loss: 0.0127 - acc: 0.9965 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 180/200
 - 1s - loss: 0.0131 - acc: 0.9962 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 181/200
 - 1s - loss: 0.0130 - acc: 0.9968 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 182/200
 - 1s - loss: 0.0128 - acc: 0.9960 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 183/200
 - 1s - loss: 0.0127 - acc: 0.9962 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 184/200
 - 1s - loss: 0.0127 - acc: 0.9959 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 185/200
 - 1s - loss: 0.0121 - acc: 0.9970 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 186/200
 - 1s - loss: 0.0131 - acc: 0.9960 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 187/200
 - 1s - loss: 0.0129 - acc: 0.9964 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 188/200
 - 1s - loss: 0.0132 - acc: 0.9958 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 189/200
 - 1s - loss: 0.0127 - acc: 0.9959 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 190/200
 - 1s - loss: 0.0132 - acc: 0.9963 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 191/200
 - 1s - loss: 0.0125 - acc: 0.9963 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 192/200
 - 1s - loss: 0.0126 - acc: 0.9964 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 193/200
 - 1s - loss: 0.0128 - acc: 0.9962 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 194/200
 - 1s - loss: 0.0130 - acc: 0.9963 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 195/200
 - 1s - loss: 0.0128 - acc: 0.9963 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 196/200
 - 1s - loss: 0.0129 - acc: 0.9959 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 197/200
 - 1s - loss: 0.0128 - acc: 0.9963 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 198/200
 - 1s - loss: 0.0132 - acc: 0.9960 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 199/200
 - 1s - loss: 0.0120 - acc: 0.9967 - val_loss: 0.0152 - val_acc: 0.9936
Epoch 200/200
 - 1s - loss: 0.0120 - acc: 0.9963 - val_loss: 0.0152 - val_acc: 0.9936
2018-03-27 12:06:44,228 [INFO] Evaluate...
2018-03-27 12:06:48,685 [INFO] Done!
2018-03-27 12:06:48,692 [INFO] tpe_transform took 0.002599 seconds
2018-03-27 12:06:48,692 [INFO] TPE using 69/69 trials with best loss 0.011121
2018-03-27 12:06:48,699 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:06:49,698 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 7s - loss: 0.0496 - acc: 0.9800 - val_loss: 0.0223 - val_acc: 0.9930
Epoch 2/200
 - 1s - loss: 0.0258 - acc: 0.9919 - val_loss: 0.0197 - val_acc: 0.9926
Epoch 3/200
 - 1s - loss: 0.0240 - acc: 0.9926 - val_loss: 0.0191 - val_acc: 0.9936
Epoch 4/200
 - 1s - loss: 0.0227 - acc: 0.9925 - val_loss: 0.0189 - val_acc: 0.9938
Epoch 5/200
 - 1s - loss: 0.0226 - acc: 0.9922 - val_loss: 0.0185 - val_acc: 0.9942
Epoch 6/200
 - 1s - loss: 0.0225 - acc: 0.9923 - val_loss: 0.0186 - val_acc: 0.9942
Epoch 7/200
 - 1s - loss: 0.0211 - acc: 0.9931 - val_loss: 0.0186 - val_acc: 0.9940
Epoch 8/200
 - 1s - loss: 0.0218 - acc: 0.9923 - val_loss: 0.0183 - val_acc: 0.9940
Epoch 9/200
 - 1s - loss: 0.0206 - acc: 0.9931 - val_loss: 0.0180 - val_acc: 0.9942
Epoch 10/200
 - 1s - loss: 0.0208 - acc: 0.9931 - val_loss: 0.0182 - val_acc: 0.9940
Epoch 11/200
 - 1s - loss: 0.0217 - acc: 0.9930 - val_loss: 0.0179 - val_acc: 0.9940
Epoch 12/200
 - 1s - loss: 0.0209 - acc: 0.9933 - val_loss: 0.0179 - val_acc: 0.9940
Epoch 13/200
 - 1s - loss: 0.0203 - acc: 0.9930 - val_loss: 0.0179 - val_acc: 0.9940
Epoch 14/200
 - 1s - loss: 0.0201 - acc: 0.9937 - val_loss: 0.0178 - val_acc: 0.9942
Epoch 15/200
 - 1s - loss: 0.0209 - acc: 0.9932 - val_loss: 0.0178 - val_acc: 0.9942
Epoch 16/200
 - 1s - loss: 0.0194 - acc: 0.9942 - val_loss: 0.0178 - val_acc: 0.9940
Epoch 17/200
 - 1s - loss: 0.0196 - acc: 0.9933 - val_loss: 0.0177 - val_acc: 0.9940
Epoch 18/200
 - 1s - loss: 0.0212 - acc: 0.9931 - val_loss: 0.0176 - val_acc: 0.9940
Epoch 19/200
 - 1s - loss: 0.0200 - acc: 0.9929 - val_loss: 0.0175 - val_acc: 0.9942
Epoch 20/200
 - 1s - loss: 0.0200 - acc: 0.9938 - val_loss: 0.0174 - val_acc: 0.9942
Epoch 21/200
 - 1s - loss: 0.0203 - acc: 0.9932 - val_loss: 0.0175 - val_acc: 0.9942
Epoch 22/200
 - 1s - loss: 0.0189 - acc: 0.9937 - val_loss: 0.0174 - val_acc: 0.9942
Epoch 23/200
 - 1s - loss: 0.0196 - acc: 0.9932 - val_loss: 0.0174 - val_acc: 0.9942
Epoch 24/200
 - 1s - loss: 0.0200 - acc: 0.9936 - val_loss: 0.0175 - val_acc: 0.9942
Epoch 25/200
 - 1s - loss: 0.0203 - acc: 0.9931 - val_loss: 0.0174 - val_acc: 0.9942
Epoch 26/200
 - 1s - loss: 0.0198 - acc: 0.9932 - val_loss: 0.0174 - val_acc: 0.9942
Epoch 27/200
 - 1s - loss: 0.0198 - acc: 0.9935 - val_loss: 0.0174 - val_acc: 0.9942
Epoch 28/200
 - 1s - loss: 0.0192 - acc: 0.9937 - val_loss: 0.0174 - val_acc: 0.9940
Epoch 29/200
 - 1s - loss: 0.0188 - acc: 0.9936 - val_loss: 0.0174 - val_acc: 0.9942
Epoch 30/200
 - 1s - loss: 0.0195 - acc: 0.9932 - val_loss: 0.0174 - val_acc: 0.9940
Epoch 31/200
 - 1s - loss: 0.0185 - acc: 0.9936 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 32/200
 - 1s - loss: 0.0184 - acc: 0.9941 - val_loss: 0.0173 - val_acc: 0.9942
Epoch 33/200
 - 1s - loss: 0.0193 - acc: 0.9942 - val_loss: 0.0173 - val_acc: 0.9942
Epoch 34/200
 - 1s - loss: 0.0189 - acc: 0.9938 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 35/200
 - 1s - loss: 0.0189 - acc: 0.9934 - val_loss: 0.0173 - val_acc: 0.9940
Epoch 36/200
 - 1s - loss: 0.0197 - acc: 0.9932 - val_loss: 0.0172 - val_acc: 0.9942
Epoch 37/200
 - 1s - loss: 0.0193 - acc: 0.9931 - val_loss: 0.0172 - val_acc: 0.9942
Epoch 38/200
 - 1s - loss: 0.0199 - acc: 0.9929 - val_loss: 0.0171 - val_acc: 0.9942
Epoch 39/200
 - 1s - loss: 0.0180 - acc: 0.9943 - val_loss: 0.0172 - val_acc: 0.9942
Epoch 40/200
 - 1s - loss: 0.0184 - acc: 0.9940 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 41/200
 - 1s - loss: 0.0189 - acc: 0.9941 - val_loss: 0.0172 - val_acc: 0.9940
Epoch 42/200
 - 1s - loss: 0.0180 - acc: 0.9940 - val_loss: 0.0171 - val_acc: 0.9942
Epoch 43/200
 - 1s - loss: 0.0189 - acc: 0.9937 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 44/200
 - 1s - loss: 0.0196 - acc: 0.9930 - val_loss: 0.0171 - val_acc: 0.9942
Epoch 45/200
 - 1s - loss: 0.0191 - acc: 0.9937 - val_loss: 0.0171 - val_acc: 0.9942
Epoch 46/200
 - 1s - loss: 0.0194 - acc: 0.9931 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 47/200
 - 1s - loss: 0.0185 - acc: 0.9941 - val_loss: 0.0171 - val_acc: 0.9942
Epoch 48/200
 - 1s - loss: 0.0186 - acc: 0.9935 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 49/200
 - 1s - loss: 0.0187 - acc: 0.9940 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 50/200
 - 1s - loss: 0.0195 - acc: 0.9938 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 51/200
 - 1s - loss: 0.0189 - acc: 0.9940 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 52/200
 - 1s - loss: 0.0179 - acc: 0.9946 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 53/200
 - 1s - loss: 0.0189 - acc: 0.9937 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 54/200
 - 1s - loss: 0.0182 - acc: 0.9944 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 55/200
 - 1s - loss: 0.0185 - acc: 0.9937 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 56/200
 - 1s - loss: 0.0191 - acc: 0.9935 - val_loss: 0.0171 - val_acc: 0.9940
Epoch 57/200
 - 1s - loss: 0.0188 - acc: 0.9938 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 58/200
 - 1s - loss: 0.0175 - acc: 0.9950 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 59/200
 - 1s - loss: 0.0184 - acc: 0.9946 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 60/200
 - 1s - loss: 0.0185 - acc: 0.9937 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 61/200
 - 1s - loss: 0.0192 - acc: 0.9936 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 62/200
 - 1s - loss: 0.0189 - acc: 0.9936 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 63/200
 - 1s - loss: 0.0183 - acc: 0.9945 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 64/200
 - 1s - loss: 0.0189 - acc: 0.9938 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 65/200
 - 1s - loss: 0.0181 - acc: 0.9932 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 66/200
 - 1s - loss: 0.0190 - acc: 0.9933 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 67/200
 - 1s - loss: 0.0181 - acc: 0.9945 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 68/200
 - 1s - loss: 0.0188 - acc: 0.9937 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 69/200
 - 1s - loss: 0.0188 - acc: 0.9941 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 70/200
 - 1s - loss: 0.0189 - acc: 0.9938 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 71/200
 - 1s - loss: 0.0182 - acc: 0.9945 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 72/200
 - 1s - loss: 0.0181 - acc: 0.9940 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 73/200
 - 1s - loss: 0.0173 - acc: 0.9945 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 74/200
 - 1s - loss: 0.0179 - acc: 0.9944 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 75/200
 - 1s - loss: 0.0175 - acc: 0.9945 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 76/200
 - 1s - loss: 0.0175 - acc: 0.9942 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 77/200
 - 1s - loss: 0.0180 - acc: 0.9944 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 78/200
 - 1s - loss: 0.0187 - acc: 0.9937 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 79/200
 - 1s - loss: 0.0186 - acc: 0.9941 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 80/200
 - 1s - loss: 0.0178 - acc: 0.9941 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 81/200
 - 1s - loss: 0.0186 - acc: 0.9941 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 82/200
 - 1s - loss: 0.0185 - acc: 0.9940 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 83/200
 - 1s - loss: 0.0182 - acc: 0.9942 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 84/200
 - 1s - loss: 0.0177 - acc: 0.9942 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 85/200
 - 1s - loss: 0.0176 - acc: 0.9945 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 86/200
 - 1s - loss: 0.0178 - acc: 0.9940 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 87/200
 - 1s - loss: 0.0185 - acc: 0.9937 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 88/200
 - 1s - loss: 0.0178 - acc: 0.9940 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 89/200
 - 1s - loss: 0.0177 - acc: 0.9945 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 90/200
 - 1s - loss: 0.0194 - acc: 0.9938 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 91/200
 - 1s - loss: 0.0181 - acc: 0.9940 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 92/200
 - 1s - loss: 0.0188 - acc: 0.9936 - val_loss: 0.0168 - val_acc: 0.9940
Epoch 93/200
 - 1s - loss: 0.0192 - acc: 0.9938 - val_loss: 0.0168 - val_acc: 0.9940
Epoch 94/200
 - 1s - loss: 0.0184 - acc: 0.9935 - val_loss: 0.0168 - val_acc: 0.9940
Epoch 95/200
 - 1s - loss: 0.0181 - acc: 0.9939 - val_loss: 0.0168 - val_acc: 0.9940
Epoch 96/200
 - 1s - loss: 0.0183 - acc: 0.9939 - val_loss: 0.0168 - val_acc: 0.9940
Epoch 97/200
 - 1s - loss: 0.0187 - acc: 0.9936 - val_loss: 0.0168 - val_acc: 0.9940
Epoch 98/200
 - 1s - loss: 0.0187 - acc: 0.9936 - val_loss: 0.0168 - val_acc: 0.9940
Epoch 99/200
 - 1s - loss: 0.0177 - acc: 0.9943 - val_loss: 0.0168 - val_acc: 0.9940
Epoch 100/200
 - 1s - loss: 0.0167 - acc: 0.9944 - val_loss: 0.0168 - val_acc: 0.9940
Epoch 101/200
 - 1s - loss: 0.0176 - acc: 0.9941 - val_loss: 0.0168 - val_acc: 0.9940
Epoch 102/200
 - 1s - loss: 0.0173 - acc: 0.9943 - val_loss: 0.0168 - val_acc: 0.9940
Epoch 103/200
 - 1s - loss: 0.0180 - acc: 0.9946 - val_loss: 0.0168 - val_acc: 0.9940
Epoch 104/200
 - 1s - loss: 0.0182 - acc: 0.9942 - val_loss: 0.0168 - val_acc: 0.9940
Epoch 105/200
 - 1s - loss: 0.0175 - acc: 0.9946 - val_loss: 0.0168 - val_acc: 0.9940
Epoch 106/200
 - 1s - loss: 0.0183 - acc: 0.9937 - val_loss: 0.0168 - val_acc: 0.9940
2018-03-27 12:08:42,890 [INFO] Evaluate...
2018-03-27 12:08:47,276 [INFO] Done!
2018-03-27 12:08:47,282 [INFO] tpe_transform took 0.002467 seconds
2018-03-27 12:08:47,283 [INFO] TPE using 70/70 trials with best loss 0.011121
2018-03-27 12:08:47,291 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:08:48,280 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 7s - loss: 0.0508 - acc: 0.9796 - val_loss: 0.0185 - val_acc: 0.9932
Epoch 2/200
 - 1s - loss: 0.0234 - acc: 0.9922 - val_loss: 0.0173 - val_acc: 0.9932
Epoch 3/200
 - 1s - loss: 0.0221 - acc: 0.9932 - val_loss: 0.0168 - val_acc: 0.9934
Epoch 4/200
 - 1s - loss: 0.0224 - acc: 0.9926 - val_loss: 0.0172 - val_acc: 0.9938
Epoch 5/200
 - 1s - loss: 0.0216 - acc: 0.9935 - val_loss: 0.0162 - val_acc: 0.9942
Epoch 6/200
 - 1s - loss: 0.0193 - acc: 0.9944 - val_loss: 0.0161 - val_acc: 0.9938
Epoch 7/200
 - 1s - loss: 0.0194 - acc: 0.9938 - val_loss: 0.0164 - val_acc: 0.9940
Epoch 8/200
 - 1s - loss: 0.0191 - acc: 0.9932 - val_loss: 0.0160 - val_acc: 0.9942
Epoch 9/200
 - 1s - loss: 0.0193 - acc: 0.9940 - val_loss: 0.0160 - val_acc: 0.9942
Epoch 10/200
 - 1s - loss: 0.0194 - acc: 0.9942 - val_loss: 0.0158 - val_acc: 0.9940
Epoch 11/200
 - 1s - loss: 0.0187 - acc: 0.9939 - val_loss: 0.0159 - val_acc: 0.9942
Epoch 12/200
 - 1s - loss: 0.0188 - acc: 0.9940 - val_loss: 0.0159 - val_acc: 0.9942
Epoch 13/200
 - 1s - loss: 0.0191 - acc: 0.9942 - val_loss: 0.0160 - val_acc: 0.9942
Epoch 14/200
 - 1s - loss: 0.0183 - acc: 0.9940 - val_loss: 0.0158 - val_acc: 0.9942
Epoch 15/200
 - 1s - loss: 0.0169 - acc: 0.9946 - val_loss: 0.0157 - val_acc: 0.9942
Epoch 16/200
 - 1s - loss: 0.0178 - acc: 0.9947 - val_loss: 0.0159 - val_acc: 0.9942
Epoch 17/200
 - 1s - loss: 0.0180 - acc: 0.9947 - val_loss: 0.0157 - val_acc: 0.9942
Epoch 18/200
 - 1s - loss: 0.0172 - acc: 0.9943 - val_loss: 0.0157 - val_acc: 0.9942
Epoch 19/200
 - 1s - loss: 0.0187 - acc: 0.9940 - val_loss: 0.0157 - val_acc: 0.9942
Epoch 20/200
 - 1s - loss: 0.0186 - acc: 0.9941 - val_loss: 0.0157 - val_acc: 0.9942
Epoch 21/200
 - 1s - loss: 0.0166 - acc: 0.9946 - val_loss: 0.0157 - val_acc: 0.9942
Epoch 22/200
 - 1s - loss: 0.0165 - acc: 0.9950 - val_loss: 0.0158 - val_acc: 0.9942
Epoch 23/200
 - 1s - loss: 0.0167 - acc: 0.9949 - val_loss: 0.0157 - val_acc: 0.9942
Epoch 24/200
 - 1s - loss: 0.0173 - acc: 0.9938 - val_loss: 0.0157 - val_acc: 0.9942
Epoch 25/200
 - 1s - loss: 0.0175 - acc: 0.9946 - val_loss: 0.0156 - val_acc: 0.9944
Epoch 26/200
 - 1s - loss: 0.0186 - acc: 0.9940 - val_loss: 0.0157 - val_acc: 0.9942
Epoch 27/200
 - 1s - loss: 0.0175 - acc: 0.9945 - val_loss: 0.0156 - val_acc: 0.9944
Epoch 28/200
 - 1s - loss: 0.0188 - acc: 0.9934 - val_loss: 0.0156 - val_acc: 0.9944
Epoch 29/200
 - 1s - loss: 0.0160 - acc: 0.9953 - val_loss: 0.0155 - val_acc: 0.9944
Epoch 30/200
 - 1s - loss: 0.0171 - acc: 0.9948 - val_loss: 0.0156 - val_acc: 0.9944
Epoch 31/200
 - 1s - loss: 0.0165 - acc: 0.9942 - val_loss: 0.0156 - val_acc: 0.9944
Epoch 32/200
 - 1s - loss: 0.0154 - acc: 0.9955 - val_loss: 0.0156 - val_acc: 0.9944
Epoch 33/200
 - 1s - loss: 0.0157 - acc: 0.9947 - val_loss: 0.0156 - val_acc: 0.9944
Epoch 34/200
 - 1s - loss: 0.0165 - acc: 0.9946 - val_loss: 0.0156 - val_acc: 0.9944
Epoch 35/200
 - 1s - loss: 0.0176 - acc: 0.9940 - val_loss: 0.0156 - val_acc: 0.9944
Epoch 36/200
 - 1s - loss: 0.0169 - acc: 0.9943 - val_loss: 0.0157 - val_acc: 0.9946
Epoch 37/200
 - 1s - loss: 0.0167 - acc: 0.9945 - val_loss: 0.0156 - val_acc: 0.9944
2018-03-27 12:09:38,901 [INFO] Evaluate...
2018-03-27 12:09:43,305 [INFO] Done!
2018-03-27 12:09:43,312 [INFO] tpe_transform took 0.003320 seconds
2018-03-27 12:09:43,313 [INFO] TPE using 71/71 trials with best loss 0.011121
2018-03-27 12:09:43,320 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:09:44,311 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 7s - loss: 0.0405 - acc: 0.9850 - val_loss: 0.0243 - val_acc: 0.9910
Epoch 2/200
 - 1s - loss: 0.0206 - acc: 0.9933 - val_loss: 0.0227 - val_acc: 0.9924
Epoch 3/200
 - 1s - loss: 0.0196 - acc: 0.9935 - val_loss: 0.0221 - val_acc: 0.9926
Epoch 4/200
 - 1s - loss: 0.0184 - acc: 0.9940 - val_loss: 0.0218 - val_acc: 0.9916
Epoch 5/200
 - 1s - loss: 0.0181 - acc: 0.9940 - val_loss: 0.0216 - val_acc: 0.9920
Epoch 6/200
 - 1s - loss: 0.0177 - acc: 0.9944 - val_loss: 0.0214 - val_acc: 0.9920
Epoch 7/200
 - 1s - loss: 0.0174 - acc: 0.9941 - val_loss: 0.0213 - val_acc: 0.9922
Epoch 8/200
 - 1s - loss: 0.0165 - acc: 0.9950 - val_loss: 0.0212 - val_acc: 0.9926
Epoch 9/200
 - 1s - loss: 0.0168 - acc: 0.9947 - val_loss: 0.0211 - val_acc: 0.9928
Epoch 10/200
 - 1s - loss: 0.0166 - acc: 0.9946 - val_loss: 0.0211 - val_acc: 0.9928
Epoch 11/200
 - 1s - loss: 0.0166 - acc: 0.9947 - val_loss: 0.0210 - val_acc: 0.9934
Epoch 12/200
 - 1s - loss: 0.0165 - acc: 0.9950 - val_loss: 0.0209 - val_acc: 0.9930
Epoch 13/200
 - 1s - loss: 0.0163 - acc: 0.9949 - val_loss: 0.0209 - val_acc: 0.9930
Epoch 14/200
 - 1s - loss: 0.0164 - acc: 0.9946 - val_loss: 0.0208 - val_acc: 0.9934
Epoch 15/200
 - 1s - loss: 0.0163 - acc: 0.9947 - val_loss: 0.0208 - val_acc: 0.9932
Epoch 16/200
 - 1s - loss: 0.0160 - acc: 0.9950 - val_loss: 0.0208 - val_acc: 0.9934
Epoch 17/200
 - 1s - loss: 0.0163 - acc: 0.9950 - val_loss: 0.0207 - val_acc: 0.9934
Epoch 18/200
 - 1s - loss: 0.0157 - acc: 0.9952 - val_loss: 0.0207 - val_acc: 0.9934
Epoch 19/200
 - 1s - loss: 0.0160 - acc: 0.9949 - val_loss: 0.0207 - val_acc: 0.9934
Epoch 20/200
 - 1s - loss: 0.0157 - acc: 0.9950 - val_loss: 0.0207 - val_acc: 0.9934
Epoch 21/200
 - 1s - loss: 0.0155 - acc: 0.9953 - val_loss: 0.0206 - val_acc: 0.9934
Epoch 22/200
 - 1s - loss: 0.0156 - acc: 0.9951 - val_loss: 0.0206 - val_acc: 0.9934
Epoch 23/200
 - 1s - loss: 0.0155 - acc: 0.9950 - val_loss: 0.0206 - val_acc: 0.9934
Epoch 24/200
 - 1s - loss: 0.0159 - acc: 0.9951 - val_loss: 0.0206 - val_acc: 0.9934
Epoch 25/200
 - 1s - loss: 0.0152 - acc: 0.9951 - val_loss: 0.0205 - val_acc: 0.9934
Epoch 26/200
 - 1s - loss: 0.0150 - acc: 0.9951 - val_loss: 0.0205 - val_acc: 0.9934
Epoch 27/200
 - 1s - loss: 0.0152 - acc: 0.9951 - val_loss: 0.0205 - val_acc: 0.9938
Epoch 28/200
 - 1s - loss: 0.0154 - acc: 0.9948 - val_loss: 0.0205 - val_acc: 0.9934
Epoch 29/200
 - 1s - loss: 0.0150 - acc: 0.9953 - val_loss: 0.0205 - val_acc: 0.9934
Epoch 30/200
 - 1s - loss: 0.0154 - acc: 0.9951 - val_loss: 0.0205 - val_acc: 0.9936
Epoch 31/200
 - 1s - loss: 0.0155 - acc: 0.9951 - val_loss: 0.0205 - val_acc: 0.9934
Epoch 32/200
 - 1s - loss: 0.0154 - acc: 0.9951 - val_loss: 0.0204 - val_acc: 0.9938
Epoch 33/200
 - 1s - loss: 0.0152 - acc: 0.9951 - val_loss: 0.0204 - val_acc: 0.9938
Epoch 34/200
 - 1s - loss: 0.0155 - acc: 0.9952 - val_loss: 0.0204 - val_acc: 0.9938
Epoch 35/200
 - 1s - loss: 0.0150 - acc: 0.9951 - val_loss: 0.0204 - val_acc: 0.9938
Epoch 36/200
 - 1s - loss: 0.0150 - acc: 0.9951 - val_loss: 0.0204 - val_acc: 0.9938
Epoch 37/200
 - 1s - loss: 0.0148 - acc: 0.9951 - val_loss: 0.0204 - val_acc: 0.9938
Epoch 38/200
 - 1s - loss: 0.0150 - acc: 0.9955 - val_loss: 0.0204 - val_acc: 0.9938
Epoch 39/200
 - 1s - loss: 0.0151 - acc: 0.9951 - val_loss: 0.0204 - val_acc: 0.9938
Epoch 40/200
 - 1s - loss: 0.0148 - acc: 0.9952 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 41/200
 - 1s - loss: 0.0151 - acc: 0.9952 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 42/200
 - 1s - loss: 0.0147 - acc: 0.9955 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 43/200
 - 1s - loss: 0.0150 - acc: 0.9953 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 44/200
 - 1s - loss: 0.0149 - acc: 0.9955 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 45/200
 - 1s - loss: 0.0152 - acc: 0.9949 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 46/200
 - 1s - loss: 0.0147 - acc: 0.9953 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 47/200
 - 1s - loss: 0.0149 - acc: 0.9952 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 48/200
 - 1s - loss: 0.0149 - acc: 0.9953 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 49/200
 - 1s - loss: 0.0148 - acc: 0.9955 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 50/200
 - 1s - loss: 0.0149 - acc: 0.9954 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 51/200
 - 1s - loss: 0.0149 - acc: 0.9954 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 52/200
 - 1s - loss: 0.0144 - acc: 0.9954 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 53/200
 - 1s - loss: 0.0148 - acc: 0.9955 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 54/200
 - 1s - loss: 0.0150 - acc: 0.9954 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 55/200
 - 1s - loss: 0.0144 - acc: 0.9958 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 56/200
 - 1s - loss: 0.0146 - acc: 0.9952 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 57/200
 - 1s - loss: 0.0145 - acc: 0.9955 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 58/200
 - 1s - loss: 0.0146 - acc: 0.9953 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 59/200
 - 1s - loss: 0.0147 - acc: 0.9953 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 60/200
 - 1s - loss: 0.0148 - acc: 0.9954 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 61/200
 - 1s - loss: 0.0146 - acc: 0.9954 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 62/200
 - 1s - loss: 0.0149 - acc: 0.9954 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 63/200
 - 1s - loss: 0.0145 - acc: 0.9954 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 64/200
 - 1s - loss: 0.0148 - acc: 0.9954 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 65/200
 - 1s - loss: 0.0147 - acc: 0.9955 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 66/200
 - 1s - loss: 0.0146 - acc: 0.9954 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 67/200
 - 1s - loss: 0.0144 - acc: 0.9955 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 68/200
 - 1s - loss: 0.0142 - acc: 0.9955 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 69/200
 - 1s - loss: 0.0150 - acc: 0.9952 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 70/200
 - 1s - loss: 0.0142 - acc: 0.9957 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 71/200
 - 1s - loss: 0.0142 - acc: 0.9954 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 72/200
 - 1s - loss: 0.0142 - acc: 0.9956 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 73/200
 - 1s - loss: 0.0144 - acc: 0.9952 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 74/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 75/200
 - 1s - loss: 0.0143 - acc: 0.9954 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 76/200
 - 1s - loss: 0.0144 - acc: 0.9950 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 77/200
 - 1s - loss: 0.0142 - acc: 0.9954 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 78/200
 - 1s - loss: 0.0144 - acc: 0.9955 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 79/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 80/200
 - 1s - loss: 0.0144 - acc: 0.9955 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 81/200
 - 1s - loss: 0.0142 - acc: 0.9956 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 82/200
 - 1s - loss: 0.0147 - acc: 0.9955 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 83/200
 - 1s - loss: 0.0145 - acc: 0.9955 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 84/200
 - 1s - loss: 0.0143 - acc: 0.9955 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 85/200
 - 1s - loss: 0.0142 - acc: 0.9955 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 86/200
 - 1s - loss: 0.0145 - acc: 0.9953 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 87/200
 - 1s - loss: 0.0145 - acc: 0.9954 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 88/200
 - 1s - loss: 0.0143 - acc: 0.9960 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 89/200
 - 1s - loss: 0.0142 - acc: 0.9954 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 90/200
 - 1s - loss: 0.0143 - acc: 0.9954 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 91/200
 - 1s - loss: 0.0143 - acc: 0.9955 - val_loss: 0.0201 - val_acc: 0.9938
Epoch 92/200
 - 1s - loss: 0.0143 - acc: 0.9955 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 93/200
 - 1s - loss: 0.0144 - acc: 0.9955 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 94/200
 - 1s - loss: 0.0138 - acc: 0.9951 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 95/200
 - 1s - loss: 0.0141 - acc: 0.9958 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 96/200
 - 1s - loss: 0.0140 - acc: 0.9958 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 97/200
 - 1s - loss: 0.0142 - acc: 0.9954 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 98/200
 - 1s - loss: 0.0142 - acc: 0.9956 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 99/200
 - 1s - loss: 0.0142 - acc: 0.9957 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 100/200
 - 1s - loss: 0.0141 - acc: 0.9956 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 101/200
 - 1s - loss: 0.0142 - acc: 0.9956 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 102/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 103/200
 - 1s - loss: 0.0142 - acc: 0.9955 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 104/200
 - 1s - loss: 0.0139 - acc: 0.9954 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 105/200
 - 1s - loss: 0.0139 - acc: 0.9955 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 106/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 107/200
 - 1s - loss: 0.0143 - acc: 0.9954 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 108/200
 - 1s - loss: 0.0143 - acc: 0.9955 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 109/200
 - 1s - loss: 0.0141 - acc: 0.9957 - val_loss: 0.0200 - val_acc: 0.9936
Epoch 110/200
 - 1s - loss: 0.0141 - acc: 0.9956 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 111/200
 - 1s - loss: 0.0140 - acc: 0.9956 - val_loss: 0.0200 - val_acc: 0.9936
Epoch 112/200
 - 1s - loss: 0.0144 - acc: 0.9957 - val_loss: 0.0200 - val_acc: 0.9936
Epoch 113/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 114/200
 - 1s - loss: 0.0142 - acc: 0.9954 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 115/200
 - 1s - loss: 0.0142 - acc: 0.9957 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 116/200
 - 1s - loss: 0.0141 - acc: 0.9956 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 117/200
 - 1s - loss: 0.0136 - acc: 0.9959 - val_loss: 0.0200 - val_acc: 0.9938
Epoch 118/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0200 - val_acc: 0.9936
Epoch 119/200
 - 1s - loss: 0.0140 - acc: 0.9955 - val_loss: 0.0200 - val_acc: 0.9934
Epoch 120/200
 - 1s - loss: 0.0141 - acc: 0.9958 - val_loss: 0.0200 - val_acc: 0.9934
Epoch 121/200
 - 1s - loss: 0.0140 - acc: 0.9956 - val_loss: 0.0200 - val_acc: 0.9934
Epoch 122/200
 - 1s - loss: 0.0138 - acc: 0.9959 - val_loss: 0.0200 - val_acc: 0.9934
Epoch 123/200
 - 1s - loss: 0.0137 - acc: 0.9955 - val_loss: 0.0200 - val_acc: 0.9936
Epoch 124/200
 - 1s - loss: 0.0140 - acc: 0.9956 - val_loss: 0.0199 - val_acc: 0.9934
Epoch 125/200
 - 1s - loss: 0.0140 - acc: 0.9953 - val_loss: 0.0199 - val_acc: 0.9934
Epoch 126/200
 - 1s - loss: 0.0136 - acc: 0.9960 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 127/200
 - 1s - loss: 0.0141 - acc: 0.9958 - val_loss: 0.0199 - val_acc: 0.9934
Epoch 128/200
 - 1s - loss: 0.0138 - acc: 0.9956 - val_loss: 0.0199 - val_acc: 0.9934
Epoch 129/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 130/200
 - 1s - loss: 0.0140 - acc: 0.9957 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 131/200
 - 1s - loss: 0.0139 - acc: 0.9958 - val_loss: 0.0199 - val_acc: 0.9934
Epoch 132/200
 - 1s - loss: 0.0140 - acc: 0.9956 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 133/200
 - 1s - loss: 0.0139 - acc: 0.9955 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 134/200
 - 1s - loss: 0.0142 - acc: 0.9956 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 135/200
 - 1s - loss: 0.0138 - acc: 0.9956 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 136/200
 - 1s - loss: 0.0141 - acc: 0.9953 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 137/200
 - 1s - loss: 0.0138 - acc: 0.9954 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 138/200
 - 1s - loss: 0.0138 - acc: 0.9954 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 139/200
 - 1s - loss: 0.0138 - acc: 0.9956 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 140/200
 - 1s - loss: 0.0135 - acc: 0.9959 - val_loss: 0.0199 - val_acc: 0.9938
Epoch 141/200
 - 1s - loss: 0.0139 - acc: 0.9955 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 142/200
 - 1s - loss: 0.0137 - acc: 0.9958 - val_loss: 0.0199 - val_acc: 0.9938
Epoch 143/200
 - 1s - loss: 0.0140 - acc: 0.9958 - val_loss: 0.0199 - val_acc: 0.9938
Epoch 144/200
 - 1s - loss: 0.0142 - acc: 0.9957 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 145/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0199 - val_acc: 0.9938
Epoch 146/200
 - 1s - loss: 0.0136 - acc: 0.9959 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 147/200
 - 1s - loss: 0.0137 - acc: 0.9957 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 148/200
 - 1s - loss: 0.0137 - acc: 0.9958 - val_loss: 0.0199 - val_acc: 0.9938
Epoch 149/200
 - 1s - loss: 0.0136 - acc: 0.9959 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 150/200
 - 1s - loss: 0.0139 - acc: 0.9958 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 151/200
 - 1s - loss: 0.0141 - acc: 0.9954 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 152/200
 - 1s - loss: 0.0136 - acc: 0.9955 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 153/200
 - 1s - loss: 0.0138 - acc: 0.9958 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 154/200
 - 1s - loss: 0.0139 - acc: 0.9958 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 155/200
 - 1s - loss: 0.0138 - acc: 0.9958 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 156/200
 - 1s - loss: 0.0136 - acc: 0.9958 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 157/200
 - 1s - loss: 0.0138 - acc: 0.9956 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 158/200
 - 1s - loss: 0.0138 - acc: 0.9957 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 159/200
 - 1s - loss: 0.0140 - acc: 0.9957 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 160/200
 - 1s - loss: 0.0140 - acc: 0.9959 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 161/200
 - 1s - loss: 0.0138 - acc: 0.9959 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 162/200
 - 1s - loss: 0.0138 - acc: 0.9956 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 163/200
 - 1s - loss: 0.0139 - acc: 0.9956 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 164/200
 - 1s - loss: 0.0139 - acc: 0.9955 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 165/200
 - 1s - loss: 0.0135 - acc: 0.9958 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 166/200
 - 1s - loss: 0.0139 - acc: 0.9960 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 167/200
 - 1s - loss: 0.0142 - acc: 0.9955 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 168/200
 - 1s - loss: 0.0141 - acc: 0.9954 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 169/200
 - 1s - loss: 0.0140 - acc: 0.9955 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 170/200
 - 1s - loss: 0.0136 - acc: 0.9956 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 171/200
 - 1s - loss: 0.0140 - acc: 0.9955 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 172/200
 - 1s - loss: 0.0134 - acc: 0.9960 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 173/200
 - 1s - loss: 0.0137 - acc: 0.9956 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 174/200
 - 1s - loss: 0.0137 - acc: 0.9958 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 175/200
 - 1s - loss: 0.0137 - acc: 0.9956 - val_loss: 0.0198 - val_acc: 0.9938
Epoch 176/200
 - 1s - loss: 0.0133 - acc: 0.9960 - val_loss: 0.0198 - val_acc: 0.9938
Epoch 177/200
 - 1s - loss: 0.0133 - acc: 0.9959 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 178/200
 - 1s - loss: 0.0136 - acc: 0.9958 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 179/200
 - 1s - loss: 0.0136 - acc: 0.9956 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 180/200
 - 1s - loss: 0.0139 - acc: 0.9959 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 181/200
 - 1s - loss: 0.0140 - acc: 0.9956 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 182/200
 - 1s - loss: 0.0133 - acc: 0.9959 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 183/200
 - 1s - loss: 0.0138 - acc: 0.9960 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 184/200
 - 1s - loss: 0.0138 - acc: 0.9955 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 185/200
 - 1s - loss: 0.0136 - acc: 0.9960 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 186/200
 - 1s - loss: 0.0138 - acc: 0.9956 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 187/200
 - 1s - loss: 0.0139 - acc: 0.9953 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 188/200
 - 1s - loss: 0.0138 - acc: 0.9956 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 189/200
 - 1s - loss: 0.0139 - acc: 0.9955 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 190/200
 - 1s - loss: 0.0137 - acc: 0.9954 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 191/200
 - 1s - loss: 0.0136 - acc: 0.9956 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 192/200
 - 1s - loss: 0.0137 - acc: 0.9956 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 193/200
 - 1s - loss: 0.0137 - acc: 0.9959 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 194/200
 - 1s - loss: 0.0135 - acc: 0.9956 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 195/200
 - 1s - loss: 0.0136 - acc: 0.9956 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 196/200
 - 1s - loss: 0.0137 - acc: 0.9958 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 197/200
 - 1s - loss: 0.0136 - acc: 0.9957 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 198/200
 - 1s - loss: 0.0139 - acc: 0.9956 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 199/200
 - 1s - loss: 0.0134 - acc: 0.9960 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 200/200
 - 1s - loss: 0.0136 - acc: 0.9958 - val_loss: 0.0198 - val_acc: 0.9936
2018-03-27 12:13:04,248 [INFO] Evaluate...
2018-03-27 12:13:08,742 [INFO] Done!
2018-03-27 12:13:08,749 [INFO] tpe_transform took 0.002529 seconds
2018-03-27 12:13:08,750 [INFO] TPE using 72/72 trials with best loss 0.011121
2018-03-27 12:13:08,757 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:13:09,742 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 7s - loss: 0.0509 - acc: 0.9793 - val_loss: 0.0186 - val_acc: 0.9942
Epoch 2/200
 - 1s - loss: 0.0233 - acc: 0.9924 - val_loss: 0.0166 - val_acc: 0.9938
Epoch 3/200
 - 1s - loss: 0.0219 - acc: 0.9931 - val_loss: 0.0162 - val_acc: 0.9940
Epoch 4/200
 - 1s - loss: 0.0204 - acc: 0.9926 - val_loss: 0.0155 - val_acc: 0.9940
Epoch 5/200
 - 1s - loss: 0.0192 - acc: 0.9934 - val_loss: 0.0151 - val_acc: 0.9942
Epoch 6/200
 - 1s - loss: 0.0203 - acc: 0.9935 - val_loss: 0.0155 - val_acc: 0.9940
Epoch 7/200
 - 1s - loss: 0.0193 - acc: 0.9930 - val_loss: 0.0147 - val_acc: 0.9942
Epoch 8/200
 - 1s - loss: 0.0187 - acc: 0.9939 - val_loss: 0.0157 - val_acc: 0.9938
Epoch 9/200
 - 1s - loss: 0.0178 - acc: 0.9939 - val_loss: 0.0149 - val_acc: 0.9940
Epoch 10/200
 - 1s - loss: 0.0177 - acc: 0.9939 - val_loss: 0.0146 - val_acc: 0.9942
Epoch 11/200
 - 1s - loss: 0.0182 - acc: 0.9935 - val_loss: 0.0145 - val_acc: 0.9944
Epoch 12/200
 - 1s - loss: 0.0176 - acc: 0.9946 - val_loss: 0.0146 - val_acc: 0.9944
Epoch 13/200
 - 1s - loss: 0.0159 - acc: 0.9947 - val_loss: 0.0145 - val_acc: 0.9944
Epoch 14/200
 - 1s - loss: 0.0171 - acc: 0.9942 - val_loss: 0.0148 - val_acc: 0.9940
Epoch 15/200
 - 1s - loss: 0.0169 - acc: 0.9944 - val_loss: 0.0147 - val_acc: 0.9940
Epoch 16/200
 - 1s - loss: 0.0163 - acc: 0.9946 - val_loss: 0.0144 - val_acc: 0.9944
Epoch 17/200
 - 1s - loss: 0.0168 - acc: 0.9944 - val_loss: 0.0144 - val_acc: 0.9944
Epoch 18/200
 - 1s - loss: 0.0164 - acc: 0.9947 - val_loss: 0.0144 - val_acc: 0.9944
Epoch 19/200
 - 1s - loss: 0.0155 - acc: 0.9951 - val_loss: 0.0145 - val_acc: 0.9944
Epoch 20/200
 - 1s - loss: 0.0164 - acc: 0.9946 - val_loss: 0.0144 - val_acc: 0.9944
Epoch 21/200
 - 1s - loss: 0.0155 - acc: 0.9951 - val_loss: 0.0143 - val_acc: 0.9944
Epoch 22/200
 - 1s - loss: 0.0167 - acc: 0.9944 - val_loss: 0.0143 - val_acc: 0.9944
Epoch 23/200
 - 1s - loss: 0.0165 - acc: 0.9944 - val_loss: 0.0143 - val_acc: 0.9944
Epoch 24/200
 - 1s - loss: 0.0170 - acc: 0.9946 - val_loss: 0.0142 - val_acc: 0.9944
Epoch 25/200
 - 1s - loss: 0.0158 - acc: 0.9948 - val_loss: 0.0141 - val_acc: 0.9944
Epoch 26/200
 - 1s - loss: 0.0152 - acc: 0.9948 - val_loss: 0.0142 - val_acc: 0.9944
Epoch 27/200
 - 1s - loss: 0.0160 - acc: 0.9942 - val_loss: 0.0140 - val_acc: 0.9944
Epoch 28/200
 - 1s - loss: 0.0155 - acc: 0.9950 - val_loss: 0.0142 - val_acc: 0.9944
Epoch 29/200
 - 1s - loss: 0.0164 - acc: 0.9946 - val_loss: 0.0143 - val_acc: 0.9946
Epoch 30/200
 - 1s - loss: 0.0159 - acc: 0.9948 - val_loss: 0.0142 - val_acc: 0.9944
Epoch 31/200
 - 1s - loss: 0.0154 - acc: 0.9953 - val_loss: 0.0141 - val_acc: 0.9944
Epoch 32/200
 - 1s - loss: 0.0154 - acc: 0.9949 - val_loss: 0.0141 - val_acc: 0.9944
Epoch 33/200
 - 1s - loss: 0.0156 - acc: 0.9948 - val_loss: 0.0140 - val_acc: 0.9946
Epoch 34/200
 - 1s - loss: 0.0158 - acc: 0.9943 - val_loss: 0.0140 - val_acc: 0.9946
Epoch 35/200
 - 1s - loss: 0.0154 - acc: 0.9948 - val_loss: 0.0141 - val_acc: 0.9944
Epoch 36/200
 - 1s - loss: 0.0144 - acc: 0.9949 - val_loss: 0.0140 - val_acc: 0.9946
Epoch 37/200
 - 1s - loss: 0.0150 - acc: 0.9945 - val_loss: 0.0140 - val_acc: 0.9944
Epoch 38/200
 - 1s - loss: 0.0156 - acc: 0.9949 - val_loss: 0.0140 - val_acc: 0.9946
Epoch 39/200
 - 1s - loss: 0.0150 - acc: 0.9945 - val_loss: 0.0139 - val_acc: 0.9946
Epoch 40/200
 - 1s - loss: 0.0154 - acc: 0.9950 - val_loss: 0.0139 - val_acc: 0.9946
Epoch 41/200
 - 1s - loss: 0.0161 - acc: 0.9945 - val_loss: 0.0139 - val_acc: 0.9946
Epoch 42/200
 - 1s - loss: 0.0152 - acc: 0.9950 - val_loss: 0.0139 - val_acc: 0.9948
Epoch 43/200
 - 1s - loss: 0.0144 - acc: 0.9953 - val_loss: 0.0141 - val_acc: 0.9944
Epoch 44/200
 - 1s - loss: 0.0153 - acc: 0.9951 - val_loss: 0.0141 - val_acc: 0.9948
Epoch 45/200
 - 1s - loss: 0.0153 - acc: 0.9947 - val_loss: 0.0141 - val_acc: 0.9948
Epoch 46/200
 - 1s - loss: 0.0139 - acc: 0.9954 - val_loss: 0.0140 - val_acc: 0.9946
Epoch 47/200
 - 1s - loss: 0.0150 - acc: 0.9952 - val_loss: 0.0141 - val_acc: 0.9946
Epoch 48/200
 - 1s - loss: 0.0148 - acc: 0.9952 - val_loss: 0.0140 - val_acc: 0.9946
Epoch 49/200
 - 1s - loss: 0.0140 - acc: 0.9952 - val_loss: 0.0141 - val_acc: 0.9948
Epoch 50/200
 - 1s - loss: 0.0148 - acc: 0.9954 - val_loss: 0.0141 - val_acc: 0.9948
2018-03-27 12:14:13,220 [INFO] Evaluate...
2018-03-27 12:14:17,689 [INFO] Done!
2018-03-27 12:14:17,696 [INFO] tpe_transform took 0.002618 seconds
2018-03-27 12:14:17,697 [INFO] TPE using 73/73 trials with best loss 0.011121
2018-03-27 12:14:17,705 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:14:18,696 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 7s - loss: 0.0492 - acc: 0.9808 - val_loss: 0.0208 - val_acc: 0.9938
Epoch 2/200
 - 1s - loss: 0.0250 - acc: 0.9909 - val_loss: 0.0179 - val_acc: 0.9956
Epoch 3/200
 - 1s - loss: 0.0238 - acc: 0.9917 - val_loss: 0.0185 - val_acc: 0.9948
Epoch 4/200
 - 1s - loss: 0.0221 - acc: 0.9924 - val_loss: 0.0180 - val_acc: 0.9954
Epoch 5/200
 - 1s - loss: 0.0212 - acc: 0.9927 - val_loss: 0.0180 - val_acc: 0.9954
Epoch 6/200
 - 1s - loss: 0.0210 - acc: 0.9928 - val_loss: 0.0179 - val_acc: 0.9954
Epoch 7/200
 - 1s - loss: 0.0216 - acc: 0.9920 - val_loss: 0.0177 - val_acc: 0.9954
Epoch 8/200
 - 1s - loss: 0.0206 - acc: 0.9931 - val_loss: 0.0179 - val_acc: 0.9952
Epoch 9/200
 - 1s - loss: 0.0199 - acc: 0.9929 - val_loss: 0.0178 - val_acc: 0.9952
Epoch 10/200
 - 1s - loss: 0.0188 - acc: 0.9925 - val_loss: 0.0175 - val_acc: 0.9958
Epoch 11/200
 - 1s - loss: 0.0184 - acc: 0.9940 - val_loss: 0.0173 - val_acc: 0.9958
Epoch 12/200
 - 1s - loss: 0.0199 - acc: 0.9938 - val_loss: 0.0170 - val_acc: 0.9960
Epoch 13/200
 - 1s - loss: 0.0185 - acc: 0.9940 - val_loss: 0.0171 - val_acc: 0.9958
Epoch 14/200
 - 1s - loss: 0.0191 - acc: 0.9936 - val_loss: 0.0170 - val_acc: 0.9958
Epoch 15/200
 - 1s - loss: 0.0187 - acc: 0.9931 - val_loss: 0.0170 - val_acc: 0.9958
Epoch 16/200
 - 1s - loss: 0.0192 - acc: 0.9929 - val_loss: 0.0168 - val_acc: 0.9962
Epoch 17/200
 - 1s - loss: 0.0181 - acc: 0.9937 - val_loss: 0.0170 - val_acc: 0.9958
Epoch 18/200
 - 1s - loss: 0.0195 - acc: 0.9934 - val_loss: 0.0171 - val_acc: 0.9958
Epoch 19/200
 - 1s - loss: 0.0187 - acc: 0.9937 - val_loss: 0.0170 - val_acc: 0.9958
Epoch 20/200
 - 1s - loss: 0.0197 - acc: 0.9928 - val_loss: 0.0171 - val_acc: 0.9958
Epoch 21/200
 - 1s - loss: 0.0188 - acc: 0.9937 - val_loss: 0.0170 - val_acc: 0.9958
Epoch 22/200
 - 1s - loss: 0.0181 - acc: 0.9930 - val_loss: 0.0169 - val_acc: 0.9962
Epoch 23/200
 - 1s - loss: 0.0189 - acc: 0.9931 - val_loss: 0.0170 - val_acc: 0.9958
Epoch 24/200
 - 1s - loss: 0.0179 - acc: 0.9937 - val_loss: 0.0170 - val_acc: 0.9956
2018-03-27 12:14:58,377 [INFO] Evaluate...
2018-03-27 12:15:03,009 [INFO] Done!
2018-03-27 12:15:03,016 [INFO] tpe_transform took 0.002413 seconds
2018-03-27 12:15:03,017 [INFO] TPE using 74/74 trials with best loss 0.011121
2018-03-27 12:15:03,024 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:15:04,026 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 8s - loss: 0.0409 - acc: 0.9840 - val_loss: 0.0243 - val_acc: 0.9920
Epoch 2/200
 - 1s - loss: 0.0172 - acc: 0.9945 - val_loss: 0.0240 - val_acc: 0.9912
Epoch 3/200
 - 1s - loss: 0.0151 - acc: 0.9954 - val_loss: 0.0232 - val_acc: 0.9914
Epoch 4/200
 - 1s - loss: 0.0135 - acc: 0.9959 - val_loss: 0.0227 - val_acc: 0.9918
Epoch 5/200
 - 1s - loss: 0.0132 - acc: 0.9958 - val_loss: 0.0224 - val_acc: 0.9924
Epoch 6/200
 - 1s - loss: 0.0122 - acc: 0.9967 - val_loss: 0.0225 - val_acc: 0.9920
Epoch 7/200
 - 1s - loss: 0.0124 - acc: 0.9967 - val_loss: 0.0222 - val_acc: 0.9922
Epoch 8/200
 - 1s - loss: 0.0114 - acc: 0.9966 - val_loss: 0.0226 - val_acc: 0.9914
Epoch 9/200
 - 1s - loss: 0.0115 - acc: 0.9965 - val_loss: 0.0221 - val_acc: 0.9924
Epoch 10/200
 - 1s - loss: 0.0117 - acc: 0.9965 - val_loss: 0.0220 - val_acc: 0.9924
Epoch 11/200
 - 1s - loss: 0.0104 - acc: 0.9974 - val_loss: 0.0221 - val_acc: 0.9920
Epoch 12/200
 - 1s - loss: 0.0110 - acc: 0.9969 - val_loss: 0.0220 - val_acc: 0.9926
Epoch 13/200
 - 1s - loss: 0.0107 - acc: 0.9972 - val_loss: 0.0220 - val_acc: 0.9920
Epoch 14/200
 - 1s - loss: 0.0104 - acc: 0.9973 - val_loss: 0.0220 - val_acc: 0.9926
Epoch 15/200
 - 1s - loss: 0.0105 - acc: 0.9973 - val_loss: 0.0220 - val_acc: 0.9918
Epoch 16/200
 - 1s - loss: 0.0104 - acc: 0.9968 - val_loss: 0.0219 - val_acc: 0.9926
Epoch 17/200
 - 1s - loss: 0.0101 - acc: 0.9970 - val_loss: 0.0219 - val_acc: 0.9918
Epoch 18/200
 - 1s - loss: 0.0104 - acc: 0.9972 - val_loss: 0.0219 - val_acc: 0.9926
Epoch 19/200
 - 1s - loss: 0.0098 - acc: 0.9973 - val_loss: 0.0218 - val_acc: 0.9928
Epoch 20/200
 - 1s - loss: 0.0099 - acc: 0.9974 - val_loss: 0.0218 - val_acc: 0.9926
Epoch 21/200
 - 1s - loss: 0.0099 - acc: 0.9972 - val_loss: 0.0220 - val_acc: 0.9918
Epoch 22/200
 - 1s - loss: 0.0098 - acc: 0.9971 - val_loss: 0.0219 - val_acc: 0.9918
Epoch 23/200
 - 1s - loss: 0.0097 - acc: 0.9971 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 24/200
 - 1s - loss: 0.0099 - acc: 0.9970 - val_loss: 0.0218 - val_acc: 0.9920
Epoch 25/200
 - 1s - loss: 0.0096 - acc: 0.9973 - val_loss: 0.0218 - val_acc: 0.9920
Epoch 26/200
 - 1s - loss: 0.0099 - acc: 0.9973 - val_loss: 0.0218 - val_acc: 0.9920
Epoch 27/200
 - 1s - loss: 0.0096 - acc: 0.9976 - val_loss: 0.0218 - val_acc: 0.9920
Epoch 28/200
 - 1s - loss: 0.0091 - acc: 0.9973 - val_loss: 0.0218 - val_acc: 0.9918
Epoch 29/200
 - 1s - loss: 0.0098 - acc: 0.9971 - val_loss: 0.0217 - val_acc: 0.9920
Epoch 30/200
 - 1s - loss: 0.0100 - acc: 0.9971 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 31/200
 - 1s - loss: 0.0101 - acc: 0.9971 - val_loss: 0.0218 - val_acc: 0.9920
Epoch 32/200
 - 1s - loss: 0.0096 - acc: 0.9972 - val_loss: 0.0217 - val_acc: 0.9920
Epoch 33/200
 - 1s - loss: 0.0095 - acc: 0.9972 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 34/200
 - 1s - loss: 0.0096 - acc: 0.9974 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 35/200
 - 1s - loss: 0.0096 - acc: 0.9974 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 36/200
 - 1s - loss: 0.0090 - acc: 0.9974 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 37/200
 - 1s - loss: 0.0094 - acc: 0.9977 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 38/200
 - 1s - loss: 0.0093 - acc: 0.9977 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 39/200
 - 1s - loss: 0.0088 - acc: 0.9977 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 40/200
 - 1s - loss: 0.0091 - acc: 0.9976 - val_loss: 0.0217 - val_acc: 0.9924
Epoch 41/200
 - 1s - loss: 0.0088 - acc: 0.9978 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 42/200
 - 1s - loss: 0.0093 - acc: 0.9975 - val_loss: 0.0218 - val_acc: 0.9918
2018-03-27 12:16:00,251 [INFO] Evaluate...
2018-03-27 12:16:04,940 [INFO] Done!
2018-03-27 12:16:04,947 [INFO] tpe_transform took 0.002606 seconds
2018-03-27 12:16:04,947 [INFO] TPE using 75/75 trials with best loss 0.011121
2018-03-27 12:16:04,955 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:16:05,955 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 7s - loss: 0.0425 - acc: 0.9829 - val_loss: 0.0224 - val_acc: 0.9936
Epoch 2/200
 - 1s - loss: 0.0211 - acc: 0.9931 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 3/200
 - 1s - loss: 0.0196 - acc: 0.9936 - val_loss: 0.0200 - val_acc: 0.9930
Epoch 4/200
 - 1s - loss: 0.0180 - acc: 0.9936 - val_loss: 0.0190 - val_acc: 0.9944
Epoch 5/200
 - 1s - loss: 0.0169 - acc: 0.9941 - val_loss: 0.0189 - val_acc: 0.9942
Epoch 6/200
 - 1s - loss: 0.0167 - acc: 0.9946 - val_loss: 0.0189 - val_acc: 0.9940
Epoch 7/200
 - 1s - loss: 0.0164 - acc: 0.9946 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 8/200
 - 1s - loss: 0.0161 - acc: 0.9946 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 9/200
 - 1s - loss: 0.0153 - acc: 0.9949 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 10/200
 - 1s - loss: 0.0159 - acc: 0.9947 - val_loss: 0.0181 - val_acc: 0.9944
Epoch 11/200
 - 1s - loss: 0.0149 - acc: 0.9953 - val_loss: 0.0182 - val_acc: 0.9944
Epoch 12/200
 - 1s - loss: 0.0152 - acc: 0.9948 - val_loss: 0.0180 - val_acc: 0.9944
Epoch 13/200
 - 1s - loss: 0.0148 - acc: 0.9946 - val_loss: 0.0180 - val_acc: 0.9944
Epoch 14/200
 - 1s - loss: 0.0143 - acc: 0.9953 - val_loss: 0.0179 - val_acc: 0.9944
Epoch 15/200
 - 1s - loss: 0.0149 - acc: 0.9955 - val_loss: 0.0179 - val_acc: 0.9944
Epoch 16/200
 - 1s - loss: 0.0150 - acc: 0.9952 - val_loss: 0.0179 - val_acc: 0.9946
Epoch 17/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0178 - val_acc: 0.9944
Epoch 18/200
 - 1s - loss: 0.0144 - acc: 0.9954 - val_loss: 0.0178 - val_acc: 0.9946
Epoch 19/200
 - 1s - loss: 0.0149 - acc: 0.9950 - val_loss: 0.0178 - val_acc: 0.9946
Epoch 20/200
 - 1s - loss: 0.0138 - acc: 0.9955 - val_loss: 0.0178 - val_acc: 0.9946
Epoch 21/200
 - 1s - loss: 0.0134 - acc: 0.9961 - val_loss: 0.0178 - val_acc: 0.9946
Epoch 22/200
 - 1s - loss: 0.0135 - acc: 0.9958 - val_loss: 0.0178 - val_acc: 0.9946
Epoch 23/200
 - 1s - loss: 0.0140 - acc: 0.9950 - val_loss: 0.0178 - val_acc: 0.9946
Epoch 24/200
 - 1s - loss: 0.0138 - acc: 0.9954 - val_loss: 0.0177 - val_acc: 0.9946
Epoch 25/200
 - 1s - loss: 0.0138 - acc: 0.9956 - val_loss: 0.0177 - val_acc: 0.9946
Epoch 26/200
 - 1s - loss: 0.0138 - acc: 0.9957 - val_loss: 0.0176 - val_acc: 0.9946
Epoch 27/200
 - 1s - loss: 0.0132 - acc: 0.9959 - val_loss: 0.0176 - val_acc: 0.9946
Epoch 28/200
 - 1s - loss: 0.0136 - acc: 0.9956 - val_loss: 0.0176 - val_acc: 0.9946
Epoch 29/200
 - 1s - loss: 0.0138 - acc: 0.9957 - val_loss: 0.0176 - val_acc: 0.9946
Epoch 30/200
 - 1s - loss: 0.0136 - acc: 0.9956 - val_loss: 0.0176 - val_acc: 0.9946
Epoch 31/200
 - 1s - loss: 0.0135 - acc: 0.9955 - val_loss: 0.0176 - val_acc: 0.9946
Epoch 32/200
 - 1s - loss: 0.0135 - acc: 0.9955 - val_loss: 0.0175 - val_acc: 0.9946
Epoch 33/200
 - 1s - loss: 0.0140 - acc: 0.9950 - val_loss: 0.0175 - val_acc: 0.9946
Epoch 34/200
 - 1s - loss: 0.0137 - acc: 0.9958 - val_loss: 0.0175 - val_acc: 0.9946
Epoch 35/200
 - 1s - loss: 0.0139 - acc: 0.9959 - val_loss: 0.0176 - val_acc: 0.9946
Epoch 36/200
 - 1s - loss: 0.0138 - acc: 0.9955 - val_loss: 0.0175 - val_acc: 0.9948
Epoch 37/200
 - 1s - loss: 0.0134 - acc: 0.9956 - val_loss: 0.0175 - val_acc: 0.9948
Epoch 38/200
 - 1s - loss: 0.0139 - acc: 0.9958 - val_loss: 0.0175 - val_acc: 0.9948
Epoch 39/200
 - 1s - loss: 0.0135 - acc: 0.9955 - val_loss: 0.0175 - val_acc: 0.9944
Epoch 40/200
 - 1s - loss: 0.0134 - acc: 0.9956 - val_loss: 0.0175 - val_acc: 0.9946
Epoch 41/200
 - 1s - loss: 0.0129 - acc: 0.9960 - val_loss: 0.0174 - val_acc: 0.9944
Epoch 42/200
 - 1s - loss: 0.0128 - acc: 0.9961 - val_loss: 0.0174 - val_acc: 0.9944
Epoch 43/200
 - 1s - loss: 0.0132 - acc: 0.9951 - val_loss: 0.0175 - val_acc: 0.9946
Epoch 44/200
 - 1s - loss: 0.0123 - acc: 0.9960 - val_loss: 0.0174 - val_acc: 0.9944
Epoch 45/200
 - 1s - loss: 0.0129 - acc: 0.9960 - val_loss: 0.0174 - val_acc: 0.9944
Epoch 46/200
 - 1s - loss: 0.0122 - acc: 0.9957 - val_loss: 0.0174 - val_acc: 0.9946
Epoch 47/200
 - 1s - loss: 0.0134 - acc: 0.9958 - val_loss: 0.0174 - val_acc: 0.9946
Epoch 48/200
 - 1s - loss: 0.0132 - acc: 0.9959 - val_loss: 0.0174 - val_acc: 0.9946
Epoch 49/200
 - 1s - loss: 0.0130 - acc: 0.9960 - val_loss: 0.0174 - val_acc: 0.9946
Epoch 50/200
 - 1s - loss: 0.0130 - acc: 0.9963 - val_loss: 0.0174 - val_acc: 0.9946
Epoch 51/200
 - 1s - loss: 0.0139 - acc: 0.9955 - val_loss: 0.0174 - val_acc: 0.9946
Epoch 52/200
 - 1s - loss: 0.0132 - acc: 0.9959 - val_loss: 0.0174 - val_acc: 0.9946
Epoch 53/200
 - 1s - loss: 0.0133 - acc: 0.9957 - val_loss: 0.0174 - val_acc: 0.9946
Epoch 54/200
 - 1s - loss: 0.0126 - acc: 0.9956 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 55/200
 - 1s - loss: 0.0132 - acc: 0.9959 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 56/200
 - 1s - loss: 0.0130 - acc: 0.9963 - val_loss: 0.0174 - val_acc: 0.9946
Epoch 57/200
 - 1s - loss: 0.0125 - acc: 0.9965 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 58/200
 - 1s - loss: 0.0126 - acc: 0.9962 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 59/200
 - 1s - loss: 0.0126 - acc: 0.9965 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 60/200
 - 1s - loss: 0.0135 - acc: 0.9957 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 61/200
 - 1s - loss: 0.0129 - acc: 0.9957 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 62/200
 - 1s - loss: 0.0135 - acc: 0.9958 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 63/200
 - 1s - loss: 0.0123 - acc: 0.9961 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 64/200
 - 1s - loss: 0.0128 - acc: 0.9958 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 65/200
 - 1s - loss: 0.0123 - acc: 0.9963 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 66/200
 - 1s - loss: 0.0135 - acc: 0.9955 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 67/200
 - 1s - loss: 0.0133 - acc: 0.9958 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 68/200
 - 1s - loss: 0.0130 - acc: 0.9960 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 69/200
 - 1s - loss: 0.0128 - acc: 0.9963 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 70/200
 - 1s - loss: 0.0131 - acc: 0.9960 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 71/200
 - 1s - loss: 0.0130 - acc: 0.9961 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 72/200
 - 1s - loss: 0.0124 - acc: 0.9963 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 73/200
 - 1s - loss: 0.0134 - acc: 0.9960 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 74/200
 - 1s - loss: 0.0133 - acc: 0.9956 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 75/200
 - 1s - loss: 0.0127 - acc: 0.9958 - val_loss: 0.0173 - val_acc: 0.9946
Epoch 76/200
 - 1s - loss: 0.0124 - acc: 0.9956 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 77/200
 - 1s - loss: 0.0122 - acc: 0.9959 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 78/200
 - 1s - loss: 0.0123 - acc: 0.9961 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 79/200
 - 1s - loss: 0.0126 - acc: 0.9959 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 80/200
 - 1s - loss: 0.0124 - acc: 0.9960 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 81/200
 - 1s - loss: 0.0130 - acc: 0.9960 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 82/200
 - 1s - loss: 0.0124 - acc: 0.9964 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 83/200
 - 1s - loss: 0.0126 - acc: 0.9963 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 84/200
 - 1s - loss: 0.0127 - acc: 0.9967 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 85/200
 - 1s - loss: 0.0123 - acc: 0.9963 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 86/200
 - 1s - loss: 0.0132 - acc: 0.9961 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 87/200
 - 1s - loss: 0.0122 - acc: 0.9959 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 88/200
 - 1s - loss: 0.0119 - acc: 0.9968 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 89/200
 - 1s - loss: 0.0124 - acc: 0.9965 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 90/200
 - 1s - loss: 0.0124 - acc: 0.9960 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 91/200
 - 1s - loss: 0.0124 - acc: 0.9960 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 92/200
 - 1s - loss: 0.0128 - acc: 0.9958 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 93/200
 - 1s - loss: 0.0121 - acc: 0.9964 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 94/200
 - 1s - loss: 0.0128 - acc: 0.9961 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 95/200
 - 1s - loss: 0.0127 - acc: 0.9959 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 96/200
 - 1s - loss: 0.0129 - acc: 0.9961 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 97/200
 - 1s - loss: 0.0122 - acc: 0.9960 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 98/200
 - 1s - loss: 0.0121 - acc: 0.9963 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 99/200
 - 1s - loss: 0.0121 - acc: 0.9965 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 100/200
 - 1s - loss: 0.0120 - acc: 0.9963 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 101/200
 - 1s - loss: 0.0127 - acc: 0.9958 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 102/200
 - 1s - loss: 0.0123 - acc: 0.9959 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 103/200
 - 1s - loss: 0.0121 - acc: 0.9961 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 104/200
 - 1s - loss: 0.0121 - acc: 0.9961 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 105/200
 - 1s - loss: 0.0128 - acc: 0.9959 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 106/200
 - 1s - loss: 0.0120 - acc: 0.9960 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 107/200
 - 1s - loss: 0.0116 - acc: 0.9966 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 108/200
 - 1s - loss: 0.0126 - acc: 0.9959 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 109/200
 - 1s - loss: 0.0122 - acc: 0.9963 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 110/200
 - 1s - loss: 0.0127 - acc: 0.9958 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 111/200
 - 1s - loss: 0.0120 - acc: 0.9962 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 112/200
 - 1s - loss: 0.0127 - acc: 0.9958 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 113/200
 - 1s - loss: 0.0122 - acc: 0.9963 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 114/200
 - 1s - loss: 0.0121 - acc: 0.9960 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 115/200
 - 1s - loss: 0.0126 - acc: 0.9959 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 116/200
 - 1s - loss: 0.0128 - acc: 0.9964 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 117/200
 - 1s - loss: 0.0124 - acc: 0.9959 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 118/200
 - 1s - loss: 0.0129 - acc: 0.9962 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 119/200
 - 1s - loss: 0.0129 - acc: 0.9960 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 120/200
 - 1s - loss: 0.0120 - acc: 0.9963 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 121/200
 - 1s - loss: 0.0118 - acc: 0.9964 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 122/200
 - 1s - loss: 0.0121 - acc: 0.9963 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 123/200
 - 1s - loss: 0.0124 - acc: 0.9962 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 124/200
 - 1s - loss: 0.0121 - acc: 0.9960 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 125/200
 - 1s - loss: 0.0123 - acc: 0.9964 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 126/200
 - 1s - loss: 0.0117 - acc: 0.9963 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 127/200
 - 1s - loss: 0.0124 - acc: 0.9960 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 128/200
 - 1s - loss: 0.0125 - acc: 0.9956 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 129/200
 - 1s - loss: 0.0120 - acc: 0.9964 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 130/200
 - 1s - loss: 0.0122 - acc: 0.9963 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 131/200
 - 1s - loss: 0.0118 - acc: 0.9963 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 132/200
 - 1s - loss: 0.0119 - acc: 0.9960 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 133/200
 - 1s - loss: 0.0112 - acc: 0.9967 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 134/200
 - 1s - loss: 0.0125 - acc: 0.9958 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 135/200
 - 1s - loss: 0.0122 - acc: 0.9962 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 136/200
 - 1s - loss: 0.0114 - acc: 0.9969 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 137/200
 - 1s - loss: 0.0124 - acc: 0.9964 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 138/200
 - 1s - loss: 0.0121 - acc: 0.9963 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 139/200
 - 1s - loss: 0.0127 - acc: 0.9957 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 140/200
 - 1s - loss: 0.0123 - acc: 0.9965 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 141/200
 - 1s - loss: 0.0114 - acc: 0.9964 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 142/200
 - 1s - loss: 0.0120 - acc: 0.9966 - val_loss: 0.0171 - val_acc: 0.9946
Epoch 143/200
 - 1s - loss: 0.0124 - acc: 0.9965 - val_loss: 0.0171 - val_acc: 0.9946
2018-03-27 12:18:34,166 [INFO] Evaluate...
2018-03-27 12:18:38,898 [INFO] Done!
2018-03-27 12:18:38,905 [INFO] tpe_transform took 0.002526 seconds
2018-03-27 12:18:38,906 [INFO] TPE using 76/76 trials with best loss 0.011121
2018-03-27 12:18:38,916 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:18:39,928 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 8s - loss: 0.0542 - acc: 0.9797 - val_loss: 0.0265 - val_acc: 0.9910
Epoch 2/200
 - 1s - loss: 0.0276 - acc: 0.9906 - val_loss: 0.0251 - val_acc: 0.9918
Epoch 3/200
 - 1s - loss: 0.0264 - acc: 0.9906 - val_loss: 0.0244 - val_acc: 0.9918
Epoch 4/200
 - 1s - loss: 0.0254 - acc: 0.9917 - val_loss: 0.0240 - val_acc: 0.9918
Epoch 5/200
 - 1s - loss: 0.0236 - acc: 0.9920 - val_loss: 0.0237 - val_acc: 0.9922
Epoch 6/200
 - 1s - loss: 0.0244 - acc: 0.9923 - val_loss: 0.0236 - val_acc: 0.9916
Epoch 7/200
 - 1s - loss: 0.0234 - acc: 0.9922 - val_loss: 0.0235 - val_acc: 0.9916
Epoch 8/200
 - 1s - loss: 0.0241 - acc: 0.9918 - val_loss: 0.0233 - val_acc: 0.9916
Epoch 9/200
 - 1s - loss: 0.0230 - acc: 0.9924 - val_loss: 0.0232 - val_acc: 0.9916
Epoch 10/200
 - 1s - loss: 0.0238 - acc: 0.9922 - val_loss: 0.0231 - val_acc: 0.9918
Epoch 11/200
 - 1s - loss: 0.0251 - acc: 0.9913 - val_loss: 0.0231 - val_acc: 0.9916
Epoch 12/200
 - 1s - loss: 0.0220 - acc: 0.9928 - val_loss: 0.0230 - val_acc: 0.9916
Epoch 13/200
 - 1s - loss: 0.0238 - acc: 0.9922 - val_loss: 0.0229 - val_acc: 0.9918
Epoch 14/200
 - 1s - loss: 0.0236 - acc: 0.9922 - val_loss: 0.0228 - val_acc: 0.9918
Epoch 15/200
 - 1s - loss: 0.0234 - acc: 0.9921 - val_loss: 0.0227 - val_acc: 0.9918
Epoch 16/200
 - 1s - loss: 0.0231 - acc: 0.9922 - val_loss: 0.0227 - val_acc: 0.9918
Epoch 17/200
 - 1s - loss: 0.0220 - acc: 0.9924 - val_loss: 0.0227 - val_acc: 0.9918
Epoch 18/200
 - 1s - loss: 0.0214 - acc: 0.9928 - val_loss: 0.0227 - val_acc: 0.9916
Epoch 19/200
 - 1s - loss: 0.0222 - acc: 0.9927 - val_loss: 0.0226 - val_acc: 0.9916
Epoch 20/200
 - 1s - loss: 0.0226 - acc: 0.9923 - val_loss: 0.0226 - val_acc: 0.9918
Epoch 21/200
 - 1s - loss: 0.0217 - acc: 0.9930 - val_loss: 0.0225 - val_acc: 0.9918
Epoch 22/200
 - 1s - loss: 0.0218 - acc: 0.9929 - val_loss: 0.0225 - val_acc: 0.9918
Epoch 23/200
 - 1s - loss: 0.0220 - acc: 0.9920 - val_loss: 0.0225 - val_acc: 0.9918
Epoch 24/200
 - 1s - loss: 0.0223 - acc: 0.9921 - val_loss: 0.0224 - val_acc: 0.9918
Epoch 25/200
 - 1s - loss: 0.0221 - acc: 0.9923 - val_loss: 0.0224 - val_acc: 0.9918
Epoch 26/200
 - 1s - loss: 0.0213 - acc: 0.9930 - val_loss: 0.0224 - val_acc: 0.9918
Epoch 27/200
 - 1s - loss: 0.0219 - acc: 0.9926 - val_loss: 0.0224 - val_acc: 0.9918
Epoch 28/200
 - 1s - loss: 0.0220 - acc: 0.9928 - val_loss: 0.0223 - val_acc: 0.9920
Epoch 29/200
 - 1s - loss: 0.0215 - acc: 0.9929 - val_loss: 0.0223 - val_acc: 0.9920
Epoch 30/200
 - 1s - loss: 0.0222 - acc: 0.9924 - val_loss: 0.0223 - val_acc: 0.9920
Epoch 31/200
 - 1s - loss: 0.0213 - acc: 0.9933 - val_loss: 0.0223 - val_acc: 0.9920
Epoch 32/200
 - 1s - loss: 0.0217 - acc: 0.9927 - val_loss: 0.0222 - val_acc: 0.9922
Epoch 33/200
 - 1s - loss: 0.0225 - acc: 0.9924 - val_loss: 0.0222 - val_acc: 0.9918
Epoch 34/200
 - 1s - loss: 0.0218 - acc: 0.9924 - val_loss: 0.0222 - val_acc: 0.9920
Epoch 35/200
 - 1s - loss: 0.0212 - acc: 0.9924 - val_loss: 0.0222 - val_acc: 0.9918
Epoch 36/200
 - 1s - loss: 0.0210 - acc: 0.9929 - val_loss: 0.0222 - val_acc: 0.9920
Epoch 37/200
 - 1s - loss: 0.0216 - acc: 0.9927 - val_loss: 0.0221 - val_acc: 0.9920
Epoch 38/200
 - 1s - loss: 0.0221 - acc: 0.9931 - val_loss: 0.0221 - val_acc: 0.9920
Epoch 39/200
 - 1s - loss: 0.0216 - acc: 0.9929 - val_loss: 0.0221 - val_acc: 0.9920
Epoch 40/200
 - 1s - loss: 0.0213 - acc: 0.9928 - val_loss: 0.0221 - val_acc: 0.9920
Epoch 41/200
 - 1s - loss: 0.0216 - acc: 0.9924 - val_loss: 0.0221 - val_acc: 0.9920
Epoch 42/200
 - 1s - loss: 0.0208 - acc: 0.9930 - val_loss: 0.0221 - val_acc: 0.9920
Epoch 43/200
 - 1s - loss: 0.0213 - acc: 0.9932 - val_loss: 0.0221 - val_acc: 0.9920
Epoch 44/200
 - 1s - loss: 0.0208 - acc: 0.9931 - val_loss: 0.0221 - val_acc: 0.9920
Epoch 45/200
 - 1s - loss: 0.0208 - acc: 0.9931 - val_loss: 0.0221 - val_acc: 0.9920
Epoch 46/200
 - 1s - loss: 0.0213 - acc: 0.9930 - val_loss: 0.0221 - val_acc: 0.9920
Epoch 47/200
 - 1s - loss: 0.0205 - acc: 0.9931 - val_loss: 0.0220 - val_acc: 0.9920
Epoch 48/200
 - 1s - loss: 0.0214 - acc: 0.9931 - val_loss: 0.0220 - val_acc: 0.9920
Epoch 49/200
 - 1s - loss: 0.0210 - acc: 0.9936 - val_loss: 0.0220 - val_acc: 0.9920
Epoch 50/200
 - 1s - loss: 0.0199 - acc: 0.9938 - val_loss: 0.0220 - val_acc: 0.9920
Epoch 51/200
 - 1s - loss: 0.0210 - acc: 0.9923 - val_loss: 0.0220 - val_acc: 0.9920
Epoch 52/200
 - 1s - loss: 0.0214 - acc: 0.9928 - val_loss: 0.0220 - val_acc: 0.9920
Epoch 53/200
 - 1s - loss: 0.0217 - acc: 0.9933 - val_loss: 0.0220 - val_acc: 0.9920
Epoch 54/200
 - 1s - loss: 0.0207 - acc: 0.9937 - val_loss: 0.0220 - val_acc: 0.9920
Epoch 55/200
 - 1s - loss: 0.0210 - acc: 0.9928 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 56/200
 - 1s - loss: 0.0202 - acc: 0.9928 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 57/200
 - 1s - loss: 0.0213 - acc: 0.9934 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 58/200
 - 1s - loss: 0.0209 - acc: 0.9927 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 59/200
 - 1s - loss: 0.0199 - acc: 0.9937 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 60/200
 - 1s - loss: 0.0211 - acc: 0.9929 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 61/200
 - 1s - loss: 0.0200 - acc: 0.9940 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 62/200
 - 1s - loss: 0.0206 - acc: 0.9931 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 63/200
 - 1s - loss: 0.0212 - acc: 0.9929 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 64/200
 - 1s - loss: 0.0210 - acc: 0.9926 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 65/200
 - 1s - loss: 0.0213 - acc: 0.9931 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 66/200
 - 1s - loss: 0.0207 - acc: 0.9933 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 67/200
 - 1s - loss: 0.0210 - acc: 0.9934 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 68/200
 - 1s - loss: 0.0209 - acc: 0.9929 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 69/200
 - 1s - loss: 0.0208 - acc: 0.9927 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 70/200
 - 1s - loss: 0.0207 - acc: 0.9932 - val_loss: 0.0219 - val_acc: 0.9920
Epoch 71/200
 - 1s - loss: 0.0207 - acc: 0.9931 - val_loss: 0.0218 - val_acc: 0.9922
Epoch 72/200
 - 1s - loss: 0.0211 - acc: 0.9933 - val_loss: 0.0218 - val_acc: 0.9922
Epoch 73/200
 - 1s - loss: 0.0215 - acc: 0.9924 - val_loss: 0.0218 - val_acc: 0.9922
Epoch 74/200
 - 1s - loss: 0.0208 - acc: 0.9932 - val_loss: 0.0218 - val_acc: 0.9922
Epoch 75/200
 - 1s - loss: 0.0208 - acc: 0.9929 - val_loss: 0.0218 - val_acc: 0.9922
Epoch 76/200
 - 1s - loss: 0.0209 - acc: 0.9932 - val_loss: 0.0218 - val_acc: 0.9922
Epoch 77/200
 - 1s - loss: 0.0206 - acc: 0.9936 - val_loss: 0.0218 - val_acc: 0.9922
Epoch 78/200
 - 1s - loss: 0.0204 - acc: 0.9937 - val_loss: 0.0218 - val_acc: 0.9922
Epoch 79/200
 - 1s - loss: 0.0198 - acc: 0.9935 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 80/200
 - 1s - loss: 0.0207 - acc: 0.9933 - val_loss: 0.0218 - val_acc: 0.9922
Epoch 81/200
 - 1s - loss: 0.0201 - acc: 0.9936 - val_loss: 0.0218 - val_acc: 0.9922
Epoch 82/200
 - 1s - loss: 0.0211 - acc: 0.9934 - val_loss: 0.0218 - val_acc: 0.9922
Epoch 83/200
 - 1s - loss: 0.0214 - acc: 0.9932 - val_loss: 0.0218 - val_acc: 0.9922
Epoch 84/200
 - 1s - loss: 0.0196 - acc: 0.9933 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 85/200
 - 1s - loss: 0.0223 - acc: 0.9927 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 86/200
 - 1s - loss: 0.0210 - acc: 0.9930 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 87/200
 - 1s - loss: 0.0212 - acc: 0.9922 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 88/200
 - 1s - loss: 0.0208 - acc: 0.9929 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 89/200
 - 1s - loss: 0.0206 - acc: 0.9934 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 90/200
 - 1s - loss: 0.0202 - acc: 0.9940 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 91/200
 - 1s - loss: 0.0213 - acc: 0.9929 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 92/200
 - 1s - loss: 0.0205 - acc: 0.9931 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 93/200
 - 1s - loss: 0.0209 - acc: 0.9926 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 94/200
 - 1s - loss: 0.0207 - acc: 0.9929 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 95/200
 - 1s - loss: 0.0201 - acc: 0.9938 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 96/200
 - 1s - loss: 0.0194 - acc: 0.9935 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 97/200
 - 1s - loss: 0.0207 - acc: 0.9928 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 98/200
 - 1s - loss: 0.0204 - acc: 0.9928 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 99/200
 - 1s - loss: 0.0212 - acc: 0.9924 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 100/200
 - 1s - loss: 0.0203 - acc: 0.9928 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 101/200
 - 1s - loss: 0.0211 - acc: 0.9931 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 102/200
 - 1s - loss: 0.0194 - acc: 0.9940 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 103/200
 - 1s - loss: 0.0203 - acc: 0.9935 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 104/200
 - 1s - loss: 0.0207 - acc: 0.9937 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 105/200
 - 1s - loss: 0.0206 - acc: 0.9935 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 106/200
 - 1s - loss: 0.0213 - acc: 0.9929 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 107/200
 - 1s - loss: 0.0208 - acc: 0.9934 - val_loss: 0.0217 - val_acc: 0.9922
Epoch 108/200
 - 1s - loss: 0.0199 - acc: 0.9933 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 109/200
 - 1s - loss: 0.0203 - acc: 0.9936 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 110/200
 - 1s - loss: 0.0206 - acc: 0.9932 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 111/200
 - 1s - loss: 0.0204 - acc: 0.9931 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 112/200
 - 1s - loss: 0.0199 - acc: 0.9935 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 113/200
 - 1s - loss: 0.0195 - acc: 0.9940 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 114/200
 - 1s - loss: 0.0207 - acc: 0.9931 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 115/200
 - 1s - loss: 0.0204 - acc: 0.9933 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 116/200
 - 1s - loss: 0.0191 - acc: 0.9941 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 117/200
 - 1s - loss: 0.0204 - acc: 0.9933 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 118/200
 - 1s - loss: 0.0203 - acc: 0.9935 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 119/200
 - 1s - loss: 0.0211 - acc: 0.9931 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 120/200
 - 1s - loss: 0.0205 - acc: 0.9934 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 121/200
 - 1s - loss: 0.0204 - acc: 0.9936 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 122/200
 - 1s - loss: 0.0199 - acc: 0.9931 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 123/200
 - 1s - loss: 0.0195 - acc: 0.9937 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 124/200
 - 1s - loss: 0.0203 - acc: 0.9937 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 125/200
 - 1s - loss: 0.0208 - acc: 0.9929 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 126/200
 - 1s - loss: 0.0207 - acc: 0.9937 - val_loss: 0.0216 - val_acc: 0.9922
Epoch 127/200
 - 1s - loss: 0.0205 - acc: 0.9927 - val_loss: 0.0216 - val_acc: 0.9922
2018-03-27 12:20:54,174 [INFO] Evaluate...
2018-03-27 12:20:58,861 [INFO] Done!
2018-03-27 12:20:58,868 [INFO] tpe_transform took 0.002549 seconds
2018-03-27 12:20:58,869 [INFO] TPE using 77/77 trials with best loss 0.011121
2018-03-27 12:20:58,876 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:20:59,878 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 8s - loss: 0.0400 - acc: 0.9845 - val_loss: 0.0211 - val_acc: 0.9926
Epoch 2/200
 - 1s - loss: 0.0191 - acc: 0.9933 - val_loss: 0.0189 - val_acc: 0.9936
Epoch 3/200
 - 1s - loss: 0.0173 - acc: 0.9940 - val_loss: 0.0182 - val_acc: 0.9946
Epoch 4/200
 - 1s - loss: 0.0164 - acc: 0.9945 - val_loss: 0.0179 - val_acc: 0.9950
Epoch 5/200
 - 1s - loss: 0.0165 - acc: 0.9946 - val_loss: 0.0177 - val_acc: 0.9950
Epoch 6/200
 - 1s - loss: 0.0159 - acc: 0.9944 - val_loss: 0.0176 - val_acc: 0.9950
Epoch 7/200
 - 1s - loss: 0.0153 - acc: 0.9946 - val_loss: 0.0174 - val_acc: 0.9952
Epoch 8/200
 - 1s - loss: 0.0149 - acc: 0.9951 - val_loss: 0.0173 - val_acc: 0.9950
Epoch 9/200
 - 1s - loss: 0.0148 - acc: 0.9947 - val_loss: 0.0172 - val_acc: 0.9950
Epoch 10/200
 - 1s - loss: 0.0142 - acc: 0.9950 - val_loss: 0.0171 - val_acc: 0.9950
Epoch 11/200
 - 1s - loss: 0.0145 - acc: 0.9953 - val_loss: 0.0172 - val_acc: 0.9946
Epoch 12/200
 - 1s - loss: 0.0142 - acc: 0.9950 - val_loss: 0.0171 - val_acc: 0.9948
Epoch 13/200
 - 1s - loss: 0.0142 - acc: 0.9955 - val_loss: 0.0170 - val_acc: 0.9952
Epoch 14/200
 - 1s - loss: 0.0143 - acc: 0.9955 - val_loss: 0.0170 - val_acc: 0.9950
Epoch 15/200
 - 1s - loss: 0.0138 - acc: 0.9954 - val_loss: 0.0170 - val_acc: 0.9950
Epoch 16/200
 - 1s - loss: 0.0142 - acc: 0.9955 - val_loss: 0.0169 - val_acc: 0.9950
Epoch 17/200
 - 1s - loss: 0.0141 - acc: 0.9954 - val_loss: 0.0169 - val_acc: 0.9952
Epoch 18/200
 - 1s - loss: 0.0137 - acc: 0.9956 - val_loss: 0.0168 - val_acc: 0.9952
Epoch 19/200
 - 1s - loss: 0.0138 - acc: 0.9955 - val_loss: 0.0168 - val_acc: 0.9952
Epoch 20/200
 - 1s - loss: 0.0137 - acc: 0.9953 - val_loss: 0.0168 - val_acc: 0.9952
Epoch 21/200
 - 1s - loss: 0.0138 - acc: 0.9954 - val_loss: 0.0168 - val_acc: 0.9952
Epoch 22/200
 - 1s - loss: 0.0135 - acc: 0.9955 - val_loss: 0.0167 - val_acc: 0.9952
Epoch 23/200
 - 1s - loss: 0.0136 - acc: 0.9955 - val_loss: 0.0167 - val_acc: 0.9954
Epoch 24/200
 - 1s - loss: 0.0133 - acc: 0.9959 - val_loss: 0.0167 - val_acc: 0.9954
Epoch 25/200
 - 1s - loss: 0.0132 - acc: 0.9954 - val_loss: 0.0167 - val_acc: 0.9952
Epoch 26/200
 - 1s - loss: 0.0132 - acc: 0.9959 - val_loss: 0.0167 - val_acc: 0.9950
Epoch 27/200
 - 1s - loss: 0.0136 - acc: 0.9956 - val_loss: 0.0167 - val_acc: 0.9950
Epoch 28/200
 - 1s - loss: 0.0135 - acc: 0.9955 - val_loss: 0.0167 - val_acc: 0.9952
Epoch 29/200
 - 1s - loss: 0.0129 - acc: 0.9960 - val_loss: 0.0167 - val_acc: 0.9952
Epoch 30/200
 - 1s - loss: 0.0132 - acc: 0.9961 - val_loss: 0.0166 - val_acc: 0.9954
Epoch 31/200
 - 1s - loss: 0.0132 - acc: 0.9958 - val_loss: 0.0166 - val_acc: 0.9954
Epoch 32/200
 - 1s - loss: 0.0134 - acc: 0.9956 - val_loss: 0.0166 - val_acc: 0.9954
Epoch 33/200
 - 1s - loss: 0.0133 - acc: 0.9956 - val_loss: 0.0166 - val_acc: 0.9954
Epoch 34/200
 - 1s - loss: 0.0129 - acc: 0.9961 - val_loss: 0.0166 - val_acc: 0.9954
Epoch 35/200
 - 1s - loss: 0.0131 - acc: 0.9954 - val_loss: 0.0166 - val_acc: 0.9952
Epoch 36/200
 - 1s - loss: 0.0132 - acc: 0.9959 - val_loss: 0.0166 - val_acc: 0.9952
Epoch 37/200
 - 1s - loss: 0.0129 - acc: 0.9960 - val_loss: 0.0166 - val_acc: 0.9952
Epoch 38/200
 - 1s - loss: 0.0128 - acc: 0.9956 - val_loss: 0.0166 - val_acc: 0.9952
Epoch 39/200
 - 1s - loss: 0.0129 - acc: 0.9955 - val_loss: 0.0166 - val_acc: 0.9952
Epoch 40/200
 - 1s - loss: 0.0128 - acc: 0.9963 - val_loss: 0.0166 - val_acc: 0.9952
Epoch 41/200
 - 1s - loss: 0.0130 - acc: 0.9957 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 42/200
 - 1s - loss: 0.0129 - acc: 0.9957 - val_loss: 0.0165 - val_acc: 0.9952
Epoch 43/200
 - 1s - loss: 0.0125 - acc: 0.9960 - val_loss: 0.0165 - val_acc: 0.9952
Epoch 44/200
 - 1s - loss: 0.0128 - acc: 0.9958 - val_loss: 0.0165 - val_acc: 0.9952
Epoch 45/200
 - 1s - loss: 0.0129 - acc: 0.9959 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 46/200
 - 1s - loss: 0.0128 - acc: 0.9959 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 47/200
 - 1s - loss: 0.0127 - acc: 0.9962 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 48/200
 - 1s - loss: 0.0130 - acc: 0.9956 - val_loss: 0.0165 - val_acc: 0.9952
Epoch 49/200
 - 1s - loss: 0.0125 - acc: 0.9962 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 50/200
 - 1s - loss: 0.0127 - acc: 0.9959 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 51/200
 - 1s - loss: 0.0125 - acc: 0.9960 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 52/200
 - 1s - loss: 0.0125 - acc: 0.9964 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 53/200
 - 1s - loss: 0.0126 - acc: 0.9961 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 54/200
 - 1s - loss: 0.0127 - acc: 0.9960 - val_loss: 0.0165 - val_acc: 0.9950
Epoch 55/200
 - 1s - loss: 0.0128 - acc: 0.9957 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 56/200
 - 1s - loss: 0.0127 - acc: 0.9959 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 57/200
 - 1s - loss: 0.0126 - acc: 0.9960 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 58/200
 - 1s - loss: 0.0124 - acc: 0.9960 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 59/200
 - 1s - loss: 0.0124 - acc: 0.9958 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 60/200
 - 1s - loss: 0.0129 - acc: 0.9957 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 61/200
 - 1s - loss: 0.0123 - acc: 0.9962 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 62/200
 - 1s - loss: 0.0130 - acc: 0.9956 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 63/200
 - 1s - loss: 0.0124 - acc: 0.9960 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 64/200
 - 1s - loss: 0.0125 - acc: 0.9963 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 65/200
 - 1s - loss: 0.0125 - acc: 0.9960 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 66/200
 - 1s - loss: 0.0122 - acc: 0.9963 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 67/200
 - 1s - loss: 0.0126 - acc: 0.9962 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 68/200
 - 1s - loss: 0.0126 - acc: 0.9960 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 69/200
 - 1s - loss: 0.0124 - acc: 0.9962 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 70/200
 - 1s - loss: 0.0124 - acc: 0.9958 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 71/200
 - 1s - loss: 0.0122 - acc: 0.9961 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 72/200
 - 1s - loss: 0.0123 - acc: 0.9961 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 73/200
 - 1s - loss: 0.0123 - acc: 0.9961 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 74/200
 - 1s - loss: 0.0125 - acc: 0.9959 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 75/200
 - 1s - loss: 0.0124 - acc: 0.9960 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 76/200
 - 1s - loss: 0.0123 - acc: 0.9961 - val_loss: 0.0164 - val_acc: 0.9950
Epoch 77/200
 - 1s - loss: 0.0125 - acc: 0.9959 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 78/200
 - 1s - loss: 0.0121 - acc: 0.9964 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 79/200
 - 1s - loss: 0.0124 - acc: 0.9959 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 80/200
 - 1s - loss: 0.0124 - acc: 0.9959 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 81/200
 - 1s - loss: 0.0122 - acc: 0.9963 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 82/200
 - 1s - loss: 0.0124 - acc: 0.9960 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 83/200
 - 1s - loss: 0.0121 - acc: 0.9961 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 84/200
 - 1s - loss: 0.0124 - acc: 0.9960 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 85/200
 - 1s - loss: 0.0124 - acc: 0.9958 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 86/200
 - 1s - loss: 0.0121 - acc: 0.9964 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 87/200
 - 1s - loss: 0.0122 - acc: 0.9963 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 88/200
 - 1s - loss: 0.0123 - acc: 0.9964 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 89/200
 - 1s - loss: 0.0121 - acc: 0.9960 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 90/200
 - 1s - loss: 0.0118 - acc: 0.9967 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 91/200
 - 1s - loss: 0.0121 - acc: 0.9962 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 92/200
 - 1s - loss: 0.0123 - acc: 0.9965 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 93/200
 - 1s - loss: 0.0122 - acc: 0.9963 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 94/200
 - 1s - loss: 0.0119 - acc: 0.9965 - val_loss: 0.0163 - val_acc: 0.9950
Epoch 95/200
 - 1s - loss: 0.0123 - acc: 0.9961 - val_loss: 0.0163 - val_acc: 0.9950
[epochs 96–199 omitted: loss drifts within 0.0113–0.0125, val_loss 0.0163 → 0.0161, val_acc steady at 0.9950–0.9952]
Epoch 200/200
 - 1s - loss: 0.0115 - acc: 0.9965 - val_loss: 0.0161 - val_acc: 0.9950
2018-03-27 12:24:22,308 [INFO] Evaluate...
2018-03-27 12:24:27,017 [INFO] Done!
2018-03-27 12:24:27,024 [INFO] tpe_transform took 0.002484 seconds
2018-03-27 12:24:27,025 [INFO] TPE using 78/78 trials with best loss 0.011121
2018-03-27 12:24:27,033 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:24:28,019 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 8s - loss: 0.0663 - acc: 0.9727 - val_loss: 0.0305 - val_acc: 0.9890
Epoch 2/200
 - 1s - loss: 0.0340 - acc: 0.9881 - val_loss: 0.0287 - val_acc: 0.9902
Epoch 3/200
 - 1s - loss: 0.0315 - acc: 0.9896 - val_loss: 0.0280 - val_acc: 0.9906
[epochs 4–198 omitted: loss drifts within 0.0233–0.0300, val_loss 0.0278 → 0.0241, val_acc steady at 0.9902–0.9906]
Epoch 199/200
 - 1s - loss: 0.0241 - acc: 0.9921 - val_loss: 0.0241 - val_acc: 0.9906
Epoch 200/200
 - 1s - loss: 0.0242 - acc: 0.9918 - val_loss: 0.0241 - val_acc: 0.9906
2018-03-27 12:27:50,679 [INFO] Evaluate...
2018-03-27 12:27:55,442 [INFO] Done!
2018-03-27 12:27:55,451 [INFO] tpe_transform took 0.004178 seconds
2018-03-27 12:27:55,452 [INFO] TPE using 79/79 trials with best loss 0.011121
2018-03-27 12:27:55,459 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:27:56,445 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 8s - loss: 0.0604 - acc: 0.9786 - val_loss: 0.0257 - val_acc: 0.9916
Epoch 2/200
 - 1s - loss: 0.0262 - acc: 0.9921 - val_loss: 0.0230 - val_acc: 0.9932
Epoch 3/200
 - 1s - loss: 0.0238 - acc: 0.9919 - val_loss: 0.0213 - val_acc: 0.9932
[epochs 4–40 omitted: loss drifts within 0.0137–0.0228, val_loss 0.0204 → 0.0183, val_acc steady at 0.9934–0.9946]
Epoch 41/200
 - 1s - loss: 0.0149 - acc: 0.9953 - val_loss: 0.0183 - val_acc: 0.9942
Epoch 42/200
 - 1s - loss: 0.0143 - acc: 0.9951 - val_loss: 0.0183 - val_acc: 0.9942
2018-03-27 12:28:54,040 [INFO] Evaluate...
2018-03-27 12:28:58,872 [INFO] Done!
2018-03-27 12:28:58,879 [INFO] tpe_transform took 0.002507 seconds
2018-03-27 12:28:58,880 [INFO] TPE using 80/80 trials with best loss 0.011121
2018-03-27 12:28:58,888 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:28:59,876 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 8s - loss: 0.0434 - acc: 0.9820 - val_loss: 0.0221 - val_acc: 0.9924
Epoch 2/200
 - 1s - loss: 0.0206 - acc: 0.9931 - val_loss: 0.0206 - val_acc: 0.9928
Epoch 3/200
 - 1s - loss: 0.0196 - acc: 0.9932 - val_loss: 0.0198 - val_acc: 0.9930
[epochs 4–56 omitted: loss drifts within 0.0149–0.0190, val_loss 0.0194 → 0.0174, val_acc steady at 0.9928–0.9938]
Epoch 57/200
 - 1s - loss: 0.0152 - acc: 0.9953 - val_loss: 0.0174 - val_acc: 0.9936
Epoch 58/200
 - 1s - loss: 0.0154 - acc: 0.9954 - val_loss: 0.0174 - val_acc: 0.9936
Epoch 59/200
 - 1s - loss: 0.0151 - acc: 0.9953 - val_loss: 0.0174 - val_acc: 0.9936
Epoch 60/200
 - 1s - loss: 0.0150 - acc: 0.9951 - val_loss: 0.0173 - val_acc: 0.9936
Epoch 61/200
 - 1s - loss: 0.0154 - acc: 0.9954 - val_loss: 0.0173 - val_acc: 0.9936
Epoch 62/200
 - 1s - loss: 0.0152 - acc: 0.9954 - val_loss: 0.0173 - val_acc: 0.9936
Epoch 63/200
 - 1s - loss: 0.0148 - acc: 0.9951 - val_loss: 0.0173 - val_acc: 0.9936
Epoch 64/200
 - 1s - loss: 0.0149 - acc: 0.9957 - val_loss: 0.0173 - val_acc: 0.9938
Epoch 65/200
 - 1s - loss: 0.0149 - acc: 0.9955 - val_loss: 0.0173 - val_acc: 0.9938
Epoch 66/200
 - 1s - loss: 0.0149 - acc: 0.9957 - val_loss: 0.0173 - val_acc: 0.9938
Epoch 67/200
 - 1s - loss: 0.0150 - acc: 0.9957 - val_loss: 0.0173 - val_acc: 0.9938
Epoch 68/200
 - 1s - loss: 0.0148 - acc: 0.9955 - val_loss: 0.0173 - val_acc: 0.9936
Epoch 69/200
 - 1s - loss: 0.0150 - acc: 0.9955 - val_loss: 0.0173 - val_acc: 0.9936
Epoch 70/200
 - 1s - loss: 0.0150 - acc: 0.9953 - val_loss: 0.0173 - val_acc: 0.9936
Epoch 71/200
 - 1s - loss: 0.0150 - acc: 0.9950 - val_loss: 0.0173 - val_acc: 0.9938
Epoch 72/200
 - 1s - loss: 0.0147 - acc: 0.9955 - val_loss: 0.0173 - val_acc: 0.9936
Epoch 73/200
 - 1s - loss: 0.0150 - acc: 0.9955 - val_loss: 0.0173 - val_acc: 0.9936
Epoch 74/200
 - 1s - loss: 0.0146 - acc: 0.9955 - val_loss: 0.0173 - val_acc: 0.9936
Epoch 75/200
 - 1s - loss: 0.0149 - acc: 0.9950 - val_loss: 0.0172 - val_acc: 0.9936
Epoch 76/200
 - 1s - loss: 0.0147 - acc: 0.9957 - val_loss: 0.0172 - val_acc: 0.9936
Epoch 77/200
 - 1s - loss: 0.0149 - acc: 0.9955 - val_loss: 0.0172 - val_acc: 0.9936
Epoch 78/200
 - 1s - loss: 0.0145 - acc: 0.9955 - val_loss: 0.0172 - val_acc: 0.9936
Epoch 79/200
 - 1s - loss: 0.0147 - acc: 0.9955 - val_loss: 0.0172 - val_acc: 0.9936
Epoch 80/200
 - 1s - loss: 0.0148 - acc: 0.9959 - val_loss: 0.0172 - val_acc: 0.9936
Epoch 81/200
 - 1s - loss: 0.0151 - acc: 0.9954 - val_loss: 0.0172 - val_acc: 0.9936
Epoch 82/200
 - 1s - loss: 0.0146 - acc: 0.9956 - val_loss: 0.0172 - val_acc: 0.9936
Epoch 83/200
 - 1s - loss: 0.0145 - acc: 0.9958 - val_loss: 0.0172 - val_acc: 0.9936
Epoch 84/200
 - 1s - loss: 0.0147 - acc: 0.9955 - val_loss: 0.0172 - val_acc: 0.9938
Epoch 85/200
 - 1s - loss: 0.0149 - acc: 0.9952 - val_loss: 0.0172 - val_acc: 0.9938
Epoch 86/200
 - 1s - loss: 0.0147 - acc: 0.9954 - val_loss: 0.0172 - val_acc: 0.9938
Epoch 87/200
 - 1s - loss: 0.0148 - acc: 0.9956 - val_loss: 0.0172 - val_acc: 0.9938
Epoch 88/200
 - 1s - loss: 0.0147 - acc: 0.9955 - val_loss: 0.0172 - val_acc: 0.9938
Epoch 89/200
 - 1s - loss: 0.0148 - acc: 0.9954 - val_loss: 0.0172 - val_acc: 0.9938
Epoch 90/200
 - 1s - loss: 0.0149 - acc: 0.9954 - val_loss: 0.0172 - val_acc: 0.9938
Epoch 91/200
 - 1s - loss: 0.0147 - acc: 0.9956 - val_loss: 0.0172 - val_acc: 0.9938
Epoch 92/200
 - 1s - loss: 0.0146 - acc: 0.9957 - val_loss: 0.0172 - val_acc: 0.9938
Epoch 93/200
 - 1s - loss: 0.0147 - acc: 0.9958 - val_loss: 0.0172 - val_acc: 0.9938
Epoch 94/200
 - 1s - loss: 0.0146 - acc: 0.9955 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 95/200
 - 1s - loss: 0.0148 - acc: 0.9956 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 96/200
 - 1s - loss: 0.0146 - acc: 0.9953 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 97/200
 - 1s - loss: 0.0147 - acc: 0.9956 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 98/200
 - 1s - loss: 0.0146 - acc: 0.9955 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 99/200
 - 1s - loss: 0.0144 - acc: 0.9958 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 100/200
 - 1s - loss: 0.0147 - acc: 0.9955 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 101/200
 - 1s - loss: 0.0145 - acc: 0.9956 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 102/200
 - 1s - loss: 0.0147 - acc: 0.9957 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 103/200
 - 1s - loss: 0.0145 - acc: 0.9959 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 104/200
 - 1s - loss: 0.0149 - acc: 0.9954 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 105/200
 - 1s - loss: 0.0147 - acc: 0.9955 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 106/200
 - 1s - loss: 0.0149 - acc: 0.9950 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 107/200
 - 1s - loss: 0.0148 - acc: 0.9953 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 108/200
 - 1s - loss: 0.0144 - acc: 0.9957 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 109/200
 - 1s - loss: 0.0148 - acc: 0.9953 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 110/200
 - 1s - loss: 0.0145 - acc: 0.9955 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 111/200
 - 1s - loss: 0.0146 - acc: 0.9958 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 112/200
 - 1s - loss: 0.0146 - acc: 0.9955 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 113/200
 - 1s - loss: 0.0147 - acc: 0.9954 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 114/200
 - 1s - loss: 0.0148 - acc: 0.9956 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 115/200
 - 1s - loss: 0.0146 - acc: 0.9955 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 116/200
 - 1s - loss: 0.0144 - acc: 0.9957 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 117/200
 - 1s - loss: 0.0145 - acc: 0.9955 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 118/200
 - 1s - loss: 0.0146 - acc: 0.9956 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 119/200
 - 1s - loss: 0.0142 - acc: 0.9956 - val_loss: 0.0171 - val_acc: 0.9938
Epoch 120/200
 - 1s - loss: 0.0147 - acc: 0.9955 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 121/200
 - 1s - loss: 0.0143 - acc: 0.9954 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 122/200
 - 1s - loss: 0.0148 - acc: 0.9954 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 123/200
 - 1s - loss: 0.0145 - acc: 0.9958 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 124/200
 - 1s - loss: 0.0146 - acc: 0.9955 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 125/200
 - 1s - loss: 0.0147 - acc: 0.9954 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 126/200
 - 1s - loss: 0.0145 - acc: 0.9960 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 127/200
 - 1s - loss: 0.0147 - acc: 0.9951 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 128/200
 - 1s - loss: 0.0145 - acc: 0.9955 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 129/200
 - 1s - loss: 0.0146 - acc: 0.9956 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 130/200
 - 1s - loss: 0.0144 - acc: 0.9955 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 131/200
 - 1s - loss: 0.0144 - acc: 0.9958 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 132/200
 - 1s - loss: 0.0139 - acc: 0.9960 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 133/200
 - 1s - loss: 0.0146 - acc: 0.9958 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 134/200
 - 1s - loss: 0.0146 - acc: 0.9953 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 135/200
 - 1s - loss: 0.0146 - acc: 0.9957 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 136/200
 - 1s - loss: 0.0145 - acc: 0.9956 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 137/200
 - 1s - loss: 0.0144 - acc: 0.9956 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 138/200
 - 1s - loss: 0.0144 - acc: 0.9958 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 139/200
 - 1s - loss: 0.0146 - acc: 0.9955 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 140/200
 - 1s - loss: 0.0143 - acc: 0.9955 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 141/200
 - 1s - loss: 0.0144 - acc: 0.9958 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 142/200
 - 1s - loss: 0.0145 - acc: 0.9960 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 143/200
 - 1s - loss: 0.0143 - acc: 0.9956 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 144/200
 - 1s - loss: 0.0144 - acc: 0.9960 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 145/200
 - 1s - loss: 0.0145 - acc: 0.9958 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 146/200
 - 1s - loss: 0.0143 - acc: 0.9958 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 147/200
 - 1s - loss: 0.0139 - acc: 0.9958 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 148/200
 - 1s - loss: 0.0144 - acc: 0.9955 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 149/200
 - 1s - loss: 0.0144 - acc: 0.9957 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 150/200
 - 1s - loss: 0.0146 - acc: 0.9955 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 151/200
 - 1s - loss: 0.0147 - acc: 0.9957 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 152/200
 - 1s - loss: 0.0144 - acc: 0.9955 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 153/200
 - 1s - loss: 0.0146 - acc: 0.9955 - val_loss: 0.0170 - val_acc: 0.9940
Epoch 154/200
 - 1s - loss: 0.0145 - acc: 0.9955 - val_loss: 0.0170 - val_acc: 0.9938
Epoch 155/200
 - 1s - loss: 0.0142 - acc: 0.9959 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 156/200
 - 1s - loss: 0.0142 - acc: 0.9958 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 157/200
 - 1s - loss: 0.0147 - acc: 0.9957 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 158/200
 - 1s - loss: 0.0143 - acc: 0.9958 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 159/200
 - 1s - loss: 0.0145 - acc: 0.9953 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 160/200
 - 1s - loss: 0.0141 - acc: 0.9959 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 161/200
 - 1s - loss: 0.0144 - acc: 0.9958 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 162/200
 - 1s - loss: 0.0144 - acc: 0.9956 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 163/200
 - 1s - loss: 0.0143 - acc: 0.9954 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 164/200
 - 1s - loss: 0.0143 - acc: 0.9957 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 165/200
 - 1s - loss: 0.0139 - acc: 0.9961 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 166/200
 - 1s - loss: 0.0144 - acc: 0.9955 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 167/200
 - 1s - loss: 0.0144 - acc: 0.9959 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 168/200
 - 1s - loss: 0.0144 - acc: 0.9954 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 169/200
 - 1s - loss: 0.0146 - acc: 0.9957 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 170/200
 - 1s - loss: 0.0147 - acc: 0.9959 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 171/200
 - 1s - loss: 0.0142 - acc: 0.9958 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 172/200
 - 1s - loss: 0.0143 - acc: 0.9958 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 173/200
 - 1s - loss: 0.0144 - acc: 0.9959 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 174/200
 - 1s - loss: 0.0145 - acc: 0.9955 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 175/200
 - 1s - loss: 0.0140 - acc: 0.9959 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 176/200
 - 1s - loss: 0.0146 - acc: 0.9956 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 177/200
 - 1s - loss: 0.0143 - acc: 0.9955 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 178/200
 - 1s - loss: 0.0141 - acc: 0.9958 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 179/200
 - 1s - loss: 0.0141 - acc: 0.9958 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 180/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 181/200
 - 1s - loss: 0.0145 - acc: 0.9955 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 182/200
 - 1s - loss: 0.0145 - acc: 0.9958 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 183/200
 - 1s - loss: 0.0147 - acc: 0.9955 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 184/200
 - 1s - loss: 0.0144 - acc: 0.9956 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 185/200
 - 1s - loss: 0.0142 - acc: 0.9958 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 186/200
 - 1s - loss: 0.0148 - acc: 0.9958 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 187/200
 - 1s - loss: 0.0142 - acc: 0.9960 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 188/200
 - 1s - loss: 0.0141 - acc: 0.9956 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 189/200
 - 1s - loss: 0.0141 - acc: 0.9960 - val_loss: 0.0169 - val_acc: 0.9938
Epoch 190/200
 - 1s - loss: 0.0141 - acc: 0.9957 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 191/200
 - 1s - loss: 0.0143 - acc: 0.9956 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 192/200
 - 1s - loss: 0.0141 - acc: 0.9955 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 193/200
 - 1s - loss: 0.0143 - acc: 0.9960 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 194/200
 - 1s - loss: 0.0145 - acc: 0.9956 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 195/200
 - 1s - loss: 0.0141 - acc: 0.9956 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 196/200
 - 1s - loss: 0.0142 - acc: 0.9959 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 197/200
 - 1s - loss: 0.0142 - acc: 0.9956 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 198/200
 - 1s - loss: 0.0142 - acc: 0.9956 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 199/200
 - 1s - loss: 0.0140 - acc: 0.9955 - val_loss: 0.0169 - val_acc: 0.9940
Epoch 200/200
 - 1s - loss: 0.0141 - acc: 0.9956 - val_loss: 0.0169 - val_acc: 0.9940
2018-03-27 12:32:22,896 [INFO] Evaluate...
2018-03-27 12:32:27,779 [INFO] Done!
2018-03-27 12:32:27,785 [INFO] tpe_transform took 0.002531 seconds
2018-03-27 12:32:27,786 [INFO] TPE using 81/81 trials with best loss 0.011121
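Each trial above loads pre-extracted bottleneck features from the three backbone networks and fits a small classifier head on their concatenation while hyperopt's TPE samples hyperparameters. A minimal sketch of the feature-stacking step, using synthetic arrays in place of the real `original_*.h5` files (the per-backbone feature width of 2048 is an assumption for illustration):

```python
import numpy as np

# Stand-ins for the bottleneck features loaded from
# original_InceptionV3.h5, original_Xception.h5, original_ResNet50.h5.
n_samples = 8
features = [np.random.rand(n_samples, 2048).astype("float32") for _ in range(3)]

# Concatenate along the feature axis; the stacked matrix is what the
# small dense head is trained on ("model created" in the log above).
stacked = np.concatenate(features, axis=1)
print(stacked.shape)
```

Running this prints `(8, 6144)`: one row per sample, with the three backbones' feature vectors side by side.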
2018-03-27 12:32:27,794 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:32:28,780 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 8s - loss: 0.0449 - acc: 0.9811 - val_loss: 0.0218 - val_acc: 0.9928
Epoch 2/200
 - 1s - loss: 0.0182 - acc: 0.9942 - val_loss: 0.0202 - val_acc: 0.9932
Epoch 3/200
 - 1s - loss: 0.0167 - acc: 0.9947 - val_loss: 0.0193 - val_acc: 0.9924
Epoch 4/200
 - 1s - loss: 0.0150 - acc: 0.9955 - val_loss: 0.0190 - val_acc: 0.9928
Epoch 5/200
 - 1s - loss: 0.0139 - acc: 0.9961 - val_loss: 0.0192 - val_acc: 0.9936
... epochs 6-79 omitted: loss declined from 0.0131 to 0.0089, val_loss from 0.0188 to 0.0179, val_acc plateauing around 0.9938-0.9942 ...
Epoch 80/200
 - 1s - loss: 0.0090 - acc: 0.9976 - val_loss: 0.0180 - val_acc: 0.9942
Epoch 81/200
 - 1s - loss: 0.0088 - acc: 0.9978 - val_loss: 0.0180 - val_acc: 0.9942
Epoch 82/200
 - 1s - loss: 0.0088 - acc: 0.9977 - val_loss: 0.0180 - val_acc: 0.9942
2018-03-27 12:34:03,100 [INFO] Evaluate...
2018-03-27 12:34:08,062 [INFO] Done!
2018-03-27 12:34:08,069 [INFO] tpe_transform took 0.003342 seconds
2018-03-27 12:34:08,070 [INFO] TPE using 82/82 trials with best loss 0.011121
2018-03-27 12:34:08,079 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:34:09,066 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 8s - loss: 0.0623 - acc: 0.9735 - val_loss: 0.0245 - val_acc: 0.9908
Epoch 2/200
 - 1s - loss: 0.0259 - acc: 0.9919 - val_loss: 0.0218 - val_acc: 0.9916
Epoch 3/200
 - 1s - loss: 0.0231 - acc: 0.9923 - val_loss: 0.0208 - val_acc: 0.9924
Epoch 4/200
 - 1s - loss: 0.0219 - acc: 0.9928 - val_loss: 0.0203 - val_acc: 0.9920
Epoch 5/200
 - 1s - loss: 0.0220 - acc: 0.9932 - val_loss: 0.0196 - val_acc: 0.9926
... epochs 6-138 omitted: loss declined from 0.0216 to 0.0164, val_loss from 0.0193 to 0.0165, val_acc plateauing around 0.9926-0.9928 ...
Epoch 139/200
 - 1s - loss: 0.0159 - acc: 0.9951 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 140/200
 - 1s - loss: 0.0150 - acc: 0.9955 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 141/200
 - 1s - loss: 0.0163 - acc: 0.9949 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 142/200
 - 1s - loss: 0.0162 - acc: 0.9949 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 143/200
 - 1s - loss: 0.0160 - acc: 0.9950 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 144/200
 - 1s - loss: 0.0159 - acc: 0.9951 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 145/200
 - 1s - loss: 0.0168 - acc: 0.9946 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 146/200
 - 1s - loss: 0.0162 - acc: 0.9952 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 147/200
 - 1s - loss: 0.0154 - acc: 0.9956 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 148/200
 - 1s - loss: 0.0166 - acc: 0.9947 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 149/200
 - 1s - loss: 0.0153 - acc: 0.9956 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 150/200
 - 1s - loss: 0.0163 - acc: 0.9950 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 151/200
 - 1s - loss: 0.0161 - acc: 0.9953 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 152/200
 - 1s - loss: 0.0155 - acc: 0.9950 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 153/200
 - 1s - loss: 0.0159 - acc: 0.9949 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 154/200
 - 1s - loss: 0.0160 - acc: 0.9950 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 155/200
 - 1s - loss: 0.0155 - acc: 0.9949 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 156/200
 - 1s - loss: 0.0157 - acc: 0.9955 - val_loss: 0.0165 - val_acc: 0.9928
Epoch 157/200
 - 1s - loss: 0.0162 - acc: 0.9948 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 158/200
 - 1s - loss: 0.0156 - acc: 0.9956 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 159/200
 - 1s - loss: 0.0157 - acc: 0.9951 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 160/200
 - 1s - loss: 0.0161 - acc: 0.9950 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 161/200
 - 1s - loss: 0.0158 - acc: 0.9955 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 162/200
 - 1s - loss: 0.0167 - acc: 0.9950 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 163/200
 - 1s - loss: 0.0154 - acc: 0.9953 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 164/200
 - 1s - loss: 0.0160 - acc: 0.9951 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 165/200
 - 1s - loss: 0.0153 - acc: 0.9951 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 166/200
 - 1s - loss: 0.0153 - acc: 0.9955 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 167/200
 - 1s - loss: 0.0158 - acc: 0.9955 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 168/200
 - 1s - loss: 0.0153 - acc: 0.9958 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 169/200
 - 1s - loss: 0.0162 - acc: 0.9954 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 170/200
 - 1s - loss: 0.0160 - acc: 0.9946 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 171/200
 - 1s - loss: 0.0161 - acc: 0.9950 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 172/200
 - 1s - loss: 0.0160 - acc: 0.9947 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 173/200
 - 1s - loss: 0.0154 - acc: 0.9955 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 174/200
 - 1s - loss: 0.0156 - acc: 0.9953 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 175/200
 - 1s - loss: 0.0165 - acc: 0.9950 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 176/200
 - 1s - loss: 0.0160 - acc: 0.9953 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 177/200
 - 1s - loss: 0.0159 - acc: 0.9946 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 178/200
 - 1s - loss: 0.0159 - acc: 0.9956 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 179/200
 - 1s - loss: 0.0163 - acc: 0.9949 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 180/200
 - 1s - loss: 0.0156 - acc: 0.9950 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 181/200
 - 1s - loss: 0.0157 - acc: 0.9952 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 182/200
 - 1s - loss: 0.0164 - acc: 0.9950 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 183/200
 - 1s - loss: 0.0156 - acc: 0.9954 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 184/200
 - 1s - loss: 0.0161 - acc: 0.9954 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 185/200
 - 1s - loss: 0.0159 - acc: 0.9950 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 186/200
 - 1s - loss: 0.0157 - acc: 0.9957 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 187/200
 - 1s - loss: 0.0156 - acc: 0.9951 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 188/200
 - 1s - loss: 0.0161 - acc: 0.9947 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 189/200
 - 1s - loss: 0.0156 - acc: 0.9955 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 190/200
 - 1s - loss: 0.0159 - acc: 0.9954 - val_loss: 0.0164 - val_acc: 0.9928
Epoch 191/200
 - 1s - loss: 0.0162 - acc: 0.9954 - val_loss: 0.0163 - val_acc: 0.9928
Epoch 192/200
 - 1s - loss: 0.0161 - acc: 0.9955 - val_loss: 0.0163 - val_acc: 0.9928
Epoch 193/200
 - 1s - loss: 0.0155 - acc: 0.9952 - val_loss: 0.0163 - val_acc: 0.9928
Epoch 194/200
 - 1s - loss: 0.0155 - acc: 0.9955 - val_loss: 0.0163 - val_acc: 0.9928
Epoch 195/200
 - 1s - loss: 0.0160 - acc: 0.9953 - val_loss: 0.0163 - val_acc: 0.9928
Epoch 196/200
 - 1s - loss: 0.0157 - acc: 0.9949 - val_loss: 0.0163 - val_acc: 0.9928
Epoch 197/200
 - 1s - loss: 0.0152 - acc: 0.9955 - val_loss: 0.0163 - val_acc: 0.9928
Epoch 198/200
 - 1s - loss: 0.0155 - acc: 0.9955 - val_loss: 0.0163 - val_acc: 0.9928
Epoch 199/200
 - 1s - loss: 0.0157 - acc: 0.9954 - val_loss: 0.0163 - val_acc: 0.9928
Epoch 200/200
 - 1s - loss: 0.0151 - acc: 0.9957 - val_loss: 0.0163 - val_acc: 0.9928
2018-03-27 12:37:32,538 [INFO] Evaluate...
2018-03-27 12:37:37,544 [INFO] Done!
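In the run above, val_loss barely moves after roughly epoch 100, so each fixed 200-epoch trial spends about half its time on a plateau. Below is a minimal, self-contained sketch of the early-stopping rule that would cut this short; the function name and thresholds are illustrative assumptions, not code from this notebook (in Keras the equivalent is the `EarlyStopping` callback):

```python
def should_stop(val_losses, patience=10, min_delta=1e-4):
    """Return True once val_loss has failed to improve by at least
    min_delta for `patience` consecutive epochs."""
    best = float('inf')
    epochs_without_improvement = 0
    for v in val_losses:
        if v < best - min_delta:
            best = v
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                return True
    return False

# A plateau like the one in the logs triggers the rule quickly:
plateau = [0.0194, 0.0184, 0.0174, 0.0167] + [0.0167] * 15
print(should_stop(plateau))  # True
```

In the notebook itself this would amount to passing something like `EarlyStopping(monitor='val_loss', patience=10)` in the `callbacks` list of `model.fit`, so each TPE trial stops as soon as the validation loss flattens.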
2018-03-27 12:37:37,550 [INFO] tpe_transform took 0.002455 seconds
2018-03-27 12:37:37,551 [INFO] TPE using 83/83 trials with best loss 0.011121
2018-03-27 12:37:37,558 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:37:38,551 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 8s - loss: 0.0545 - acc: 0.9782 - val_loss: 0.0194 - val_acc: 0.9934
Epoch 2/200
 - 1s - loss: 0.0251 - acc: 0.9912 - val_loss: 0.0184 - val_acc: 0.9938
Epoch 3/200
 - 1s - loss: 0.0236 - acc: 0.9921 - val_loss: 0.0174 - val_acc: 0.9944
Epoch 4/200
 - 1s - loss: 0.0224 - acc: 0.9927 - val_loss: 0.0170 - val_acc: 0.9942
Epoch 5/200
 - 1s - loss: 0.0215 - acc: 0.9925 - val_loss: 0.0166 - val_acc: 0.9944
[... epochs 6-118 omitted: loss settles at 0.016-0.019 (acc ≈ 0.994-0.995) while val_loss improves slowly from 0.0164 to 0.0148 and val_acc plateaus at 0.9946-0.9954 ...]
Epoch 119/200
 - 1s - loss: 0.0157 - acc: 0.9955 - val_loss: 0.0148 - val_acc: 0.9952
Epoch 120/200
 - 1s - loss: 0.0161 - acc: 0.9947 - val_loss: 0.0148 - val_acc: 0.9952
2018-03-27 12:39:47,970 [INFO] Evaluate...
2018-03-27 12:39:52,907 [INFO] Done!
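Each trial starts by loading bottleneck features from three pretrained extractors (`original_InceptionV3.h5`, `original_Xception.h5`, `original_ResNet50.h5`). A minimal sketch of how such per-model features are typically concatenated into one input matrix for the small dense classifier; the function name, synthetic shapes, and the commented h5py layout are assumptions (the actual loading lives in `helper.py`):

```python
import numpy as np

def merge_features(feature_arrays):
    """Stack per-extractor feature matrices side by side: each array is
    (n_samples, n_features_i); the result is (n_samples, sum of n_features_i)."""
    return np.concatenate(feature_arrays, axis=1)

# Loading from one of the .h5 files might look roughly like (assumed layout):
# import h5py
# with h5py.File('original_InceptionV3.h5', 'r') as f:
#     inception = np.array(f['train'])

# Shape check with small synthetic stand-ins for the three extractors,
# each producing a 2048-dimensional feature vector per image:
inception = np.zeros((100, 2048))
xception = np.zeros((100, 2048))
resnet = np.zeros((100, 2048))
merged = merge_features([inception, xception, resnet])
print(merged.shape)  # (100, 6144)
```

The merged matrix is what the "Train on 20000 samples, validate on 5000 samples" lines refer to: the dense head trains on concatenated features, which is why each epoch takes only about a second.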
2018-03-27 12:39:52,914 [INFO] tpe_transform took 0.002497 seconds
2018-03-27 12:39:52,914 [INFO] TPE using 84/84 trials with best loss 0.011121
2018-03-27 12:39:52,921 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:39:53,908 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 8s - loss: 0.0566 - acc: 0.9815 - val_loss: 0.0239 - val_acc: 0.9940
Epoch 2/200
 - 1s - loss: 0.0310 - acc: 0.9907 - val_loss: 0.0216 - val_acc: 0.9944
Epoch 3/200
 - 1s - loss: 0.0280 - acc: 0.9909 - val_loss: 0.0207 - val_acc: 0.9942
Epoch 4/200
 - 1s - loss: 0.0268 - acc: 0.9917 - val_loss: 0.0201 - val_acc: 0.9940
Epoch 5/200
 - 1s - loss: 0.0267 - acc: 0.9919 - val_loss: 0.0197 - val_acc: 0.9944
[... epochs 6-180 omitted: loss settles at 0.020-0.026 (acc ≈ 0.992-0.994) while val_loss improves slowly from 0.0194 to 0.0165 and val_acc plateaus at 0.9940-0.9950 ...]
Epoch 181/200
 - 1s - loss: 0.0207 - acc: 0.9937 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 182/200
 - 1s - loss: 0.0205 - acc: 0.9936 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 183/200
 - 1s - loss: 0.0208 - acc: 0.9937 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 184/200
 - 1s - loss: 0.0201 - acc: 0.9940 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 185/200
 - 1s - loss: 0.0209 - acc: 0.9937 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 186/200
 - 1s - loss: 0.0216 - acc: 0.9933 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 187/200
 - 1s - loss: 0.0207 - acc: 0.9929 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 188/200
 - 1s - loss: 0.0198 - acc: 0.9940 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 189/200
 - 1s - loss: 0.0209 - acc: 0.9933 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 190/200
 - 1s - loss: 0.0208 - acc: 0.9931 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 191/200
 - 1s - loss: 0.0212 - acc: 0.9933 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 192/200
 - 1s - loss: 0.0212 - acc: 0.9932 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 193/200
 - 1s - loss: 0.0201 - acc: 0.9939 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 194/200
 - 1s - loss: 0.0213 - acc: 0.9937 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 195/200
 - 1s - loss: 0.0206 - acc: 0.9935 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 196/200
 - 1s - loss: 0.0216 - acc: 0.9930 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 197/200
 - 1s - loss: 0.0207 - acc: 0.9935 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 198/200
 - 1s - loss: 0.0207 - acc: 0.9931 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 199/200
 - 1s - loss: 0.0212 - acc: 0.9933 - val_loss: 0.0165 - val_acc: 0.9948
Epoch 200/200
 - 1s - loss: 0.0211 - acc: 0.9930 - val_loss: 0.0165 - val_acc: 0.9948
2018-03-27 12:43:18,323 [INFO] Evaluate...
2018-03-27 12:43:23,344 [INFO] Done!
2018-03-27 12:43:23,351 [INFO] tpe_transform took 0.003197 seconds
2018-03-27 12:43:23,352 [INFO] TPE using 85/85 trials with best loss 0.011121
2018-03-27 12:43:23,360 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:43:24,348 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 8s - loss: 0.0521 - acc: 0.9816 - val_loss: 0.0263 - val_acc: 0.9920
Epoch 2/200
 - 1s - loss: 0.0266 - acc: 0.9919 - val_loss: 0.0247 - val_acc: 0.9924
[Epochs 3–199 omitted: val_loss decreased slowly from 0.0238 to 0.0200; val_acc rose from 0.9930 to 0.9938]
Epoch 200/200
 - 1s - loss: 0.0196 - acc: 0.9942 - val_loss: 0.0200 - val_acc: 0.9938
2018-03-27 12:46:48,983 [INFO] Evaluate...
2018-03-27 12:46:54,023 [INFO] Done!
2018-03-27 12:46:54,030 [INFO] tpe_transform took 0.002486 seconds
2018-03-27 12:46:54,030 [INFO] TPE using 86/86 trials with best loss 0.011121
2018-03-27 12:46:54,039 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:46:55,025 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 8s - loss: 0.0866 - acc: 0.9678 - val_loss: 0.0366 - val_acc: 0.9904
Epoch 2/200
 - 1s - loss: 0.0418 - acc: 0.9864 - val_loss: 0.0331 - val_acc: 0.9908
Epoch 3/200
 - 1s - loss: 0.0383 - acc: 0.9883 - val_loss: 0.0316 - val_acc: 0.9910
[Epochs 4–144 omitted: val_loss decreased slowly from 0.0306 to 0.0245; val_acc rose from 0.9908 to 0.9928]
Epoch 145/200
 - 1s - loss: 0.0268 - acc: 0.9913 - val_loss: 0.0245 - val_acc: 0.9928
Epoch 146/200
 - 1s - loss: 0.0286 - acc: 0.9909 - val_loss: 0.0245 - val_acc: 0.9928
Epoch 147/200
 - 1s - loss: 0.0276 - acc: 0.9911 - val_loss: 0.0245 - val_acc: 0.9928
Epoch 148/200
 - 1s - loss: 0.0265 - acc: 0.9917 - val_loss: 0.0245 - val_acc: 0.9928
Epoch 149/200
 - 1s - loss: 0.0266 - acc: 0.9918 - val_loss: 0.0245 - val_acc: 0.9928
Epoch 150/200
 - 1s - loss: 0.0271 - acc: 0.9918 - val_loss: 0.0245 - val_acc: 0.9928
Epoch 151/200
 - 1s - loss: 0.0274 - acc: 0.9914 - val_loss: 0.0245 - val_acc: 0.9928
Epoch 152/200
 - 1s - loss: 0.0274 - acc: 0.9909 - val_loss: 0.0245 - val_acc: 0.9928
Epoch 153/200
 - 1s - loss: 0.0275 - acc: 0.9913 - val_loss: 0.0245 - val_acc: 0.9928
Epoch 154/200
 - 1s - loss: 0.0269 - acc: 0.9919 - val_loss: 0.0245 - val_acc: 0.9928
Epoch 155/200
 - 1s - loss: 0.0264 - acc: 0.9928 - val_loss: 0.0245 - val_acc: 0.9928
Epoch 156/200
 - 1s - loss: 0.0276 - acc: 0.9919 - val_loss: 0.0245 - val_acc: 0.9928
Epoch 157/200
 - 1s - loss: 0.0267 - acc: 0.9919 - val_loss: 0.0245 - val_acc: 0.9928
Epoch 158/200
 - 1s - loss: 0.0277 - acc: 0.9913 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 159/200
 - 1s - loss: 0.0268 - acc: 0.9919 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 160/200
 - 1s - loss: 0.0278 - acc: 0.9917 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 161/200
 - 1s - loss: 0.0282 - acc: 0.9905 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 162/200
 - 1s - loss: 0.0271 - acc: 0.9918 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 163/200
 - 1s - loss: 0.0274 - acc: 0.9917 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 164/200
 - 1s - loss: 0.0268 - acc: 0.9918 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 165/200
 - 1s - loss: 0.0276 - acc: 0.9915 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 166/200
 - 1s - loss: 0.0281 - acc: 0.9910 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 167/200
 - 1s - loss: 0.0278 - acc: 0.9909 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 168/200
 - 1s - loss: 0.0281 - acc: 0.9906 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 169/200
 - 1s - loss: 0.0271 - acc: 0.9913 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 170/200
 - 1s - loss: 0.0272 - acc: 0.9915 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 171/200
 - 1s - loss: 0.0268 - acc: 0.9911 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 172/200
 - 1s - loss: 0.0268 - acc: 0.9913 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 173/200
 - 1s - loss: 0.0273 - acc: 0.9915 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 174/200
 - 1s - loss: 0.0266 - acc: 0.9917 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 175/200
 - 1s - loss: 0.0263 - acc: 0.9916 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 176/200
 - 1s - loss: 0.0267 - acc: 0.9913 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 177/200
 - 1s - loss: 0.0278 - acc: 0.9908 - val_loss: 0.0244 - val_acc: 0.9928
Epoch 178/200
 - 1s - loss: 0.0272 - acc: 0.9918 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 179/200
 - 1s - loss: 0.0282 - acc: 0.9911 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 180/200
 - 1s - loss: 0.0278 - acc: 0.9910 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 181/200
 - 1s - loss: 0.0270 - acc: 0.9919 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 182/200
 - 1s - loss: 0.0280 - acc: 0.9921 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 183/200
 - 1s - loss: 0.0263 - acc: 0.9919 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 184/200
 - 1s - loss: 0.0268 - acc: 0.9917 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 185/200
 - 1s - loss: 0.0269 - acc: 0.9922 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 186/200
 - 1s - loss: 0.0270 - acc: 0.9918 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 187/200
 - 1s - loss: 0.0268 - acc: 0.9913 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 188/200
 - 1s - loss: 0.0271 - acc: 0.9914 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 189/200
 - 1s - loss: 0.0275 - acc: 0.9911 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 190/200
 - 1s - loss: 0.0270 - acc: 0.9914 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 191/200
 - 1s - loss: 0.0260 - acc: 0.9919 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 192/200
 - 1s - loss: 0.0255 - acc: 0.9917 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 193/200
 - 1s - loss: 0.0267 - acc: 0.9911 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 194/200
 - 1s - loss: 0.0253 - acc: 0.9919 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 195/200
 - 1s - loss: 0.0269 - acc: 0.9914 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 196/200
 - 1s - loss: 0.0264 - acc: 0.9919 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 197/200
 - 1s - loss: 0.0269 - acc: 0.9920 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 198/200
 - 1s - loss: 0.0270 - acc: 0.9914 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 199/200
 - 1s - loss: 0.0275 - acc: 0.9915 - val_loss: 0.0243 - val_acc: 0.9928
Epoch 200/200
 - 1s - loss: 0.0267 - acc: 0.9919 - val_loss: 0.0242 - val_acc: 0.9928
2018-03-27 12:50:19,805 [INFO] Evaluate...
2018-03-27 12:50:24,944 [INFO] Done!
2018-03-27 12:50:24,952 [INFO] tpe_transform took 0.002465 seconds
2018-03-27 12:50:24,952 [INFO] TPE using 87/87 trials with best loss 0.011121
2018-03-27 12:50:24,960 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:50:25,948 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 8s - loss: 0.0427 - acc: 0.9851 - val_loss: 0.0217 - val_acc: 0.9926
Epoch 2/200
 - 1s - loss: 0.0204 - acc: 0.9928 - val_loss: 0.0203 - val_acc: 0.9932
Epoch 3/200
 - 1s - loss: 0.0197 - acc: 0.9932 - val_loss: 0.0198 - val_acc: 0.9944
[... epochs 4-199 omitted: val_loss falls slowly from 0.0194 to 0.0168, val_acc plateaus near 0.9948 ...]
Epoch 200/200
 - 1s - loss: 0.0134 - acc: 0.9959 - val_loss: 0.0168 - val_acc: 0.9948
2018-03-27 12:53:51,299 [INFO] Evaluate...
2018-03-27 12:53:56,423 [INFO] Done!
2018-03-27 12:53:56,430 [INFO] tpe_transform took 0.003289 seconds
2018-03-27 12:53:56,431 [INFO] TPE using 88/88 trials with best loss 0.011121
2018-03-27 12:53:56,440 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:53:57,427 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 8s - loss: 0.0587 - acc: 0.9778 - val_loss: 0.0286 - val_acc: 0.9920
Epoch 2/200
 - 1s - loss: 0.0313 - acc: 0.9913 - val_loss: 0.0258 - val_acc: 0.9926
Epoch 3/200
 - 1s - loss: 0.0288 - acc: 0.9914 - val_loss: 0.0248 - val_acc: 0.9926
Epoch 4/200
 - 1s - loss: 0.0282 - acc: 0.9919 - val_loss: 0.0239 - val_acc: 0.9930
Epoch 5/200
 - 1s - loss: 0.0264 - acc: 0.9914 - val_loss: 0.0235 - val_acc: 0.9926
Epoch 6/200
 - 1s - loss: 0.0274 - acc: 0.9913 - val_loss: 0.0230 - val_acc: 0.9928
Epoch 7/200
 - 1s - loss: 0.0263 - acc: 0.9920 - val_loss: 0.0227 - val_acc: 0.9928
Epoch 8/200
 - 1s - loss: 0.0258 - acc: 0.9926 - val_loss: 0.0225 - val_acc: 0.9928
Epoch 9/200
 - 1s - loss: 0.0253 - acc: 0.9926 - val_loss: 0.0223 - val_acc: 0.9928
Epoch 10/200
 - 1s - loss: 0.0251 - acc: 0.9920 - val_loss: 0.0221 - val_acc: 0.9928
Epoch 11/200
 - 1s - loss: 0.0253 - acc: 0.9925 - val_loss: 0.0219 - val_acc: 0.9930
Epoch 12/200
 - 1s - loss: 0.0255 - acc: 0.9921 - val_loss: 0.0218 - val_acc: 0.9930
Epoch 13/200
 - 1s - loss: 0.0246 - acc: 0.9929 - val_loss: 0.0217 - val_acc: 0.9930
Epoch 14/200
 - 1s - loss: 0.0244 - acc: 0.9930 - val_loss: 0.0216 - val_acc: 0.9934
Epoch 15/200
 - 1s - loss: 0.0245 - acc: 0.9927 - val_loss: 0.0214 - val_acc: 0.9934
Epoch 16/200
 - 1s - loss: 0.0245 - acc: 0.9923 - val_loss: 0.0213 - val_acc: 0.9934
Epoch 17/200
 - 1s - loss: 0.0243 - acc: 0.9929 - val_loss: 0.0213 - val_acc: 0.9934
Epoch 18/200
 - 1s - loss: 0.0246 - acc: 0.9926 - val_loss: 0.0212 - val_acc: 0.9934
Epoch 19/200
 - 1s - loss: 0.0239 - acc: 0.9924 - val_loss: 0.0211 - val_acc: 0.9934
Epoch 20/200
 - 1s - loss: 0.0248 - acc: 0.9922 - val_loss: 0.0210 - val_acc: 0.9936
Epoch 21/200
 - 1s - loss: 0.0241 - acc: 0.9922 - val_loss: 0.0210 - val_acc: 0.9936
Epoch 22/200
 - 1s - loss: 0.0237 - acc: 0.9925 - val_loss: 0.0209 - val_acc: 0.9936
Epoch 23/200
 - 1s - loss: 0.0241 - acc: 0.9926 - val_loss: 0.0209 - val_acc: 0.9936
Epoch 24/200
 - 1s - loss: 0.0239 - acc: 0.9929 - val_loss: 0.0208 - val_acc: 0.9936
Epoch 25/200
 - 1s - loss: 0.0229 - acc: 0.9935 - val_loss: 0.0208 - val_acc: 0.9936
Epoch 26/200
 - 1s - loss: 0.0240 - acc: 0.9929 - val_loss: 0.0207 - val_acc: 0.9938
Epoch 27/200
 - 1s - loss: 0.0235 - acc: 0.9932 - val_loss: 0.0207 - val_acc: 0.9938
Epoch 28/200
 - 1s - loss: 0.0240 - acc: 0.9933 - val_loss: 0.0206 - val_acc: 0.9938
Epoch 29/200
 - 1s - loss: 0.0231 - acc: 0.9929 - val_loss: 0.0206 - val_acc: 0.9938
Epoch 30/200
 - 1s - loss: 0.0229 - acc: 0.9928 - val_loss: 0.0205 - val_acc: 0.9938
Epoch 31/200
 - 1s - loss: 0.0237 - acc: 0.9927 - val_loss: 0.0205 - val_acc: 0.9938
Epoch 32/200
 - 1s - loss: 0.0234 - acc: 0.9927 - val_loss: 0.0205 - val_acc: 0.9938
Epoch 33/200
 - 1s - loss: 0.0236 - acc: 0.9927 - val_loss: 0.0204 - val_acc: 0.9938
Epoch 34/200
 - 1s - loss: 0.0233 - acc: 0.9928 - val_loss: 0.0204 - val_acc: 0.9938
Epoch 35/200
 - 1s - loss: 0.0234 - acc: 0.9924 - val_loss: 0.0204 - val_acc: 0.9938
Epoch 36/200
 - 1s - loss: 0.0233 - acc: 0.9930 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 37/200
 - 1s - loss: 0.0232 - acc: 0.9931 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 38/200
 - 1s - loss: 0.0228 - acc: 0.9927 - val_loss: 0.0203 - val_acc: 0.9938
Epoch 39/200
 - 1s - loss: 0.0235 - acc: 0.9927 - val_loss: 0.0202 - val_acc: 0.9938
Epoch 40/200
 - 1s - loss: 0.0233 - acc: 0.9927 - val_loss: 0.0202 - val_acc: 0.9936
Epoch 41/200
 - 1s - loss: 0.0230 - acc: 0.9933 - val_loss: 0.0202 - val_acc: 0.9936
Epoch 42/200
 - 1s - loss: 0.0229 - acc: 0.9933 - val_loss: 0.0202 - val_acc: 0.9936
Epoch 43/200
 - 1s - loss: 0.0229 - acc: 0.9932 - val_loss: 0.0201 - val_acc: 0.9936
Epoch 44/200
 - 1s - loss: 0.0231 - acc: 0.9926 - val_loss: 0.0201 - val_acc: 0.9936
Epoch 45/200
 - 1s - loss: 0.0229 - acc: 0.9931 - val_loss: 0.0201 - val_acc: 0.9936
Epoch 46/200
 - 1s - loss: 0.0228 - acc: 0.9929 - val_loss: 0.0201 - val_acc: 0.9936
Epoch 47/200
 - 1s - loss: 0.0224 - acc: 0.9929 - val_loss: 0.0200 - val_acc: 0.9936
Epoch 48/200
 - 1s - loss: 0.0220 - acc: 0.9933 - val_loss: 0.0200 - val_acc: 0.9936
Epoch 49/200
 - 1s - loss: 0.0229 - acc: 0.9931 - val_loss: 0.0200 - val_acc: 0.9936
Epoch 50/200
 - 1s - loss: 0.0237 - acc: 0.9930 - val_loss: 0.0200 - val_acc: 0.9936
Epoch 51/200
 - 1s - loss: 0.0230 - acc: 0.9932 - val_loss: 0.0200 - val_acc: 0.9936
Epoch 52/200
 - 1s - loss: 0.0234 - acc: 0.9926 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 53/200
 - 1s - loss: 0.0224 - acc: 0.9932 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 54/200
 - 1s - loss: 0.0224 - acc: 0.9934 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 55/200
 - 1s - loss: 0.0228 - acc: 0.9930 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 56/200
 - 1s - loss: 0.0222 - acc: 0.9936 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 57/200
 - 1s - loss: 0.0225 - acc: 0.9930 - val_loss: 0.0199 - val_acc: 0.9936
Epoch 58/200
 - 1s - loss: 0.0233 - acc: 0.9927 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 59/200
 - 1s - loss: 0.0229 - acc: 0.9929 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 60/200
 - 1s - loss: 0.0217 - acc: 0.9938 - val_loss: 0.0198 - val_acc: 0.9936
Epoch 61/200
 - 1s - loss: 0.0232 - acc: 0.9931 - val_loss: 0.0198 - val_acc: 0.9936
... (epochs 62–199 omitted: training plateaued, with loss ≈ 0.021–0.022, acc ≈ 0.993, and val_loss drifting from 0.0198 down to 0.0188)
Epoch 200/200
 - 1s - loss: 0.0208 - acc: 0.9935 - val_loss: 0.0188 - val_acc: 0.9942
2018-03-27 12:57:23,104 [INFO] Evaluate...
2018-03-27 12:57:28,261 [INFO] Done!
2018-03-27 12:57:28,268 [INFO] tpe_transform took 0.002463 seconds
2018-03-27 12:57:28,268 [INFO] TPE using 89/89 trials with best loss 0.011121
2018-03-27 12:57:28,276 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 12:57:29,265 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 8s - loss: 0.0513 - acc: 0.9796 - val_loss: 0.0233 - val_acc: 0.9932
Epoch 2/200
 - 1s - loss: 0.0236 - acc: 0.9925 - val_loss: 0.0214 - val_acc: 0.9946
Epoch 3/200
 - 1s - loss: 0.0215 - acc: 0.9935 - val_loss: 0.0207 - val_acc: 0.9950
... (epochs 4–199 omitted: val_loss improved gradually from 0.0203 to 0.0178 while val_acc held at ≈ 0.9946–0.9948)
Epoch 200/200
 - 1s - loss: 0.0158 - acc: 0.9949 - val_loss: 0.0178 - val_acc: 0.9946
2018-03-27 13:00:55,462 [INFO] Evaluate...
2018-03-27 13:01:00,810 [INFO] Done!
2018-03-27 13:01:00,816 [INFO] tpe_transform took 0.002485 seconds
2018-03-27 13:01:00,818 [INFO] TPE using 90/90 trials with best loss 0.011121
2018-03-27 13:01:00,826 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 13:01:01,849 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 8s - loss: 0.0658 - acc: 0.9747 - val_loss: 0.0252 - val_acc: 0.9930
Epoch 2/200
 - 1s - loss: 0.0350 - acc: 0.9877 - val_loss: 0.0236 - val_acc: 0.9936
Epoch 3/200
 - 1s - loss: 0.0337 - acc: 0.9886 - val_loss: 0.0227 - val_acc: 0.9938
... (epochs 4–70 omitted: val_loss declined steadily from 0.0222 to 0.0190, val_acc ≈ 0.9944)
Epoch 71/200
 - 1s - loss: 0.0273 - acc: 0.9910 - val_loss: 0.0190 - val_acc: 0.9944
Epoch 72/200
 - 1s - loss: 0.0281 - acc: 0.9908 - val_loss: 0.0190 - val_acc: 0.9944
Epoch 73/200
 - 1s - loss: 0.0275 - acc: 0.9908 - val_loss: 0.0190 - val_acc: 0.9944
Epoch 74/200
 - 1s - loss: 0.0267 - acc: 0.9914 - val_loss: 0.0190 - val_acc: 0.9944
Epoch 75/200
 - 1s - loss: 0.0287 - acc: 0.9909 - val_loss: 0.0190 - val_acc: 0.9944
Epoch 76/200
 - 1s - loss: 0.0279 - acc: 0.9911 - val_loss: 0.0190 - val_acc: 0.9944
Epoch 77/200
 - 1s - loss: 0.0269 - acc: 0.9911 - val_loss: 0.0189 - val_acc: 0.9944
Epoch 78/200
 - 1s - loss: 0.0280 - acc: 0.9910 - val_loss: 0.0189 - val_acc: 0.9944
Epoch 79/200
 - 1s - loss: 0.0284 - acc: 0.9900 - val_loss: 0.0189 - val_acc: 0.9944
Epoch 80/200
 - 1s - loss: 0.0282 - acc: 0.9904 - val_loss: 0.0189 - val_acc: 0.9944
Epoch 81/200
 - 1s - loss: 0.0269 - acc: 0.9914 - val_loss: 0.0189 - val_acc: 0.9944
Epoch 82/200
 - 1s - loss: 0.0281 - acc: 0.9905 - val_loss: 0.0189 - val_acc: 0.9944
Epoch 83/200
 - 1s - loss: 0.0275 - acc: 0.9906 - val_loss: 0.0189 - val_acc: 0.9944
Epoch 84/200
 - 1s - loss: 0.0282 - acc: 0.9905 - val_loss: 0.0189 - val_acc: 0.9944
Epoch 85/200
 - 1s - loss: 0.0276 - acc: 0.9908 - val_loss: 0.0189 - val_acc: 0.9944
Epoch 86/200
 - 1s - loss: 0.0276 - acc: 0.9909 - val_loss: 0.0189 - val_acc: 0.9944
Epoch 87/200
 - 1s - loss: 0.0279 - acc: 0.9917 - val_loss: 0.0189 - val_acc: 0.9944
Epoch 88/200
 - 1s - loss: 0.0277 - acc: 0.9906 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 89/200
 - 1s - loss: 0.0273 - acc: 0.9911 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 90/200
 - 1s - loss: 0.0270 - acc: 0.9914 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 91/200
 - 1s - loss: 0.0277 - acc: 0.9911 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 92/200
 - 1s - loss: 0.0288 - acc: 0.9905 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 93/200
 - 1s - loss: 0.0270 - acc: 0.9914 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 94/200
 - 1s - loss: 0.0275 - acc: 0.9910 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 95/200
 - 1s - loss: 0.0274 - acc: 0.9917 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 96/200
 - 1s - loss: 0.0281 - acc: 0.9910 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 97/200
 - 1s - loss: 0.0261 - acc: 0.9914 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 98/200
 - 1s - loss: 0.0276 - acc: 0.9906 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 99/200
 - 1s - loss: 0.0277 - acc: 0.9912 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 100/200
 - 1s - loss: 0.0278 - acc: 0.9913 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 101/200
 - 1s - loss: 0.0284 - acc: 0.9902 - val_loss: 0.0188 - val_acc: 0.9944
Epoch 102/200
 - 1s - loss: 0.0275 - acc: 0.9910 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 103/200
 - 1s - loss: 0.0270 - acc: 0.9911 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 104/200
 - 1s - loss: 0.0271 - acc: 0.9911 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 105/200
 - 1s - loss: 0.0280 - acc: 0.9904 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 106/200
 - 1s - loss: 0.0272 - acc: 0.9912 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 107/200
 - 1s - loss: 0.0272 - acc: 0.9906 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 108/200
 - 1s - loss: 0.0280 - acc: 0.9905 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 109/200
 - 1s - loss: 0.0266 - acc: 0.9911 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 110/200
 - 1s - loss: 0.0264 - acc: 0.9912 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 111/200
 - 1s - loss: 0.0283 - acc: 0.9909 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 112/200
 - 1s - loss: 0.0276 - acc: 0.9908 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 113/200
 - 1s - loss: 0.0271 - acc: 0.9911 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 114/200
 - 1s - loss: 0.0270 - acc: 0.9909 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 115/200
 - 1s - loss: 0.0292 - acc: 0.9900 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 116/200
 - 1s - loss: 0.0266 - acc: 0.9915 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 117/200
 - 1s - loss: 0.0265 - acc: 0.9913 - val_loss: 0.0187 - val_acc: 0.9944
Epoch 118/200
 - 1s - loss: 0.0277 - acc: 0.9905 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 119/200
 - 1s - loss: 0.0278 - acc: 0.9910 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 120/200
 - 1s - loss: 0.0275 - acc: 0.9906 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 121/200
 - 1s - loss: 0.0266 - acc: 0.9911 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 122/200
 - 1s - loss: 0.0271 - acc: 0.9906 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 123/200
 - 1s - loss: 0.0269 - acc: 0.9918 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 124/200
 - 1s - loss: 0.0281 - acc: 0.9911 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 125/200
 - 1s - loss: 0.0261 - acc: 0.9915 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 126/200
 - 1s - loss: 0.0278 - acc: 0.9907 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 127/200
 - 1s - loss: 0.0283 - acc: 0.9897 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 128/200
 - 1s - loss: 0.0253 - acc: 0.9915 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 129/200
 - 1s - loss: 0.0266 - acc: 0.9913 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 130/200
 - 1s - loss: 0.0276 - acc: 0.9904 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 131/200
 - 1s - loss: 0.0271 - acc: 0.9909 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 132/200
 - 1s - loss: 0.0256 - acc: 0.9919 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 133/200
 - 1s - loss: 0.0264 - acc: 0.9910 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 134/200
 - 1s - loss: 0.0271 - acc: 0.9904 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 135/200
 - 1s - loss: 0.0268 - acc: 0.9906 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 136/200
 - 1s - loss: 0.0272 - acc: 0.9909 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 137/200
 - 1s - loss: 0.0272 - acc: 0.9908 - val_loss: 0.0186 - val_acc: 0.9944
Epoch 138/200
 - 1s - loss: 0.0273 - acc: 0.9914 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 139/200
 - 1s - loss: 0.0272 - acc: 0.9915 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 140/200
 - 1s - loss: 0.0266 - acc: 0.9911 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 141/200
 - 1s - loss: 0.0272 - acc: 0.9910 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 142/200
 - 1s - loss: 0.0268 - acc: 0.9908 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 143/200
 - 1s - loss: 0.0275 - acc: 0.9906 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 144/200
 - 1s - loss: 0.0270 - acc: 0.9910 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 145/200
 - 1s - loss: 0.0277 - acc: 0.9911 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 146/200
 - 1s - loss: 0.0272 - acc: 0.9911 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 147/200
 - 1s - loss: 0.0265 - acc: 0.9911 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 148/200
 - 1s - loss: 0.0267 - acc: 0.9909 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 149/200
 - 1s - loss: 0.0278 - acc: 0.9905 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 150/200
 - 1s - loss: 0.0262 - acc: 0.9911 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 151/200
 - 1s - loss: 0.0271 - acc: 0.9916 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 152/200
 - 1s - loss: 0.0273 - acc: 0.9908 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 153/200
 - 1s - loss: 0.0263 - acc: 0.9919 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 154/200
 - 1s - loss: 0.0276 - acc: 0.9914 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 155/200
 - 1s - loss: 0.0278 - acc: 0.9905 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 156/200
 - 1s - loss: 0.0283 - acc: 0.9905 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 157/200
 - 1s - loss: 0.0278 - acc: 0.9911 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 158/200
 - 1s - loss: 0.0269 - acc: 0.9911 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 159/200
 - 1s - loss: 0.0278 - acc: 0.9904 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 160/200
 - 1s - loss: 0.0261 - acc: 0.9918 - val_loss: 0.0185 - val_acc: 0.9944
Epoch 161/200
 - 1s - loss: 0.0281 - acc: 0.9906 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 162/200
 - 1s - loss: 0.0263 - acc: 0.9917 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 163/200
 - 1s - loss: 0.0261 - acc: 0.9914 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 164/200
 - 1s - loss: 0.0282 - acc: 0.9906 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 165/200
 - 1s - loss: 0.0265 - acc: 0.9910 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 166/200
 - 1s - loss: 0.0258 - acc: 0.9919 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 167/200
 - 1s - loss: 0.0264 - acc: 0.9914 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 168/200
 - 1s - loss: 0.0259 - acc: 0.9919 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 169/200
 - 1s - loss: 0.0266 - acc: 0.9908 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 170/200
 - 1s - loss: 0.0263 - acc: 0.9919 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 171/200
 - 1s - loss: 0.0277 - acc: 0.9901 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 172/200
 - 1s - loss: 0.0269 - acc: 0.9913 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 173/200
 - 1s - loss: 0.0264 - acc: 0.9918 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 174/200
 - 1s - loss: 0.0266 - acc: 0.9910 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 175/200
 - 1s - loss: 0.0278 - acc: 0.9906 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 176/200
 - 1s - loss: 0.0275 - acc: 0.9905 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 177/200
 - 1s - loss: 0.0263 - acc: 0.9910 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 178/200
 - 1s - loss: 0.0270 - acc: 0.9905 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 179/200
 - 1s - loss: 0.0258 - acc: 0.9918 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 180/200
 - 1s - loss: 0.0252 - acc: 0.9926 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 181/200
 - 1s - loss: 0.0265 - acc: 0.9915 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 182/200
 - 1s - loss: 0.0261 - acc: 0.9919 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 183/200
 - 1s - loss: 0.0281 - acc: 0.9909 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 184/200
 - 1s - loss: 0.0282 - acc: 0.9905 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 185/200
 - 1s - loss: 0.0270 - acc: 0.9907 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 186/200
 - 1s - loss: 0.0273 - acc: 0.9909 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 187/200
 - 1s - loss: 0.0275 - acc: 0.9909 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 188/200
 - 1s - loss: 0.0270 - acc: 0.9910 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 189/200
 - 1s - loss: 0.0269 - acc: 0.9905 - val_loss: 0.0184 - val_acc: 0.9944
Epoch 190/200
 - 1s - loss: 0.0264 - acc: 0.9911 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 191/200
 - 1s - loss: 0.0279 - acc: 0.9911 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 192/200
 - 1s - loss: 0.0255 - acc: 0.9919 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 193/200
 - 1s - loss: 0.0274 - acc: 0.9911 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 194/200
 - 1s - loss: 0.0271 - acc: 0.9906 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 195/200
 - 1s - loss: 0.0258 - acc: 0.9915 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 196/200
 - 1s - loss: 0.0259 - acc: 0.9915 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 197/200
 - 1s - loss: 0.0272 - acc: 0.9909 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 198/200
 - 1s - loss: 0.0270 - acc: 0.9913 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 199/200
 - 1s - loss: 0.0266 - acc: 0.9909 - val_loss: 0.0183 - val_acc: 0.9944
Epoch 200/200
 - 1s - loss: 0.0265 - acc: 0.9913 - val_loss: 0.0183 - val_acc: 0.9944
2018-03-27 13:04:28,413 [INFO] Evaluate...
2018-03-27 13:04:33,711 [INFO] Done!
2018-03-27 13:04:33,718 [INFO] tpe_transform took 0.003267 seconds
2018-03-27 13:04:33,719 [INFO] TPE using 91/91 trials with best loss 0.011121
2018-03-27 13:04:33,726 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 13:04:34,713 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 9s - loss: 0.0439 - acc: 0.9823 - val_loss: 0.0221 - val_acc: 0.9934
Epoch 2/200
 - 1s - loss: 0.0227 - acc: 0.9920 - val_loss: 0.0199 - val_acc: 0.9938
[... epochs 3-33 omitted: val_loss bottoms out near 0.0175-0.0177 and the run halts at epoch 34 of 200 ...]
Epoch 34/200
 - 1s - loss: 0.0123 - acc: 0.9963 - val_loss: 0.0177 - val_acc: 0.9944
2018-03-27 13:05:27,249 [INFO] Evaluate...
2018-03-27 13:05:32,600 [INFO] Done!
2018-03-27 13:05:32,607 [INFO] tpe_transform took 0.002645 seconds
2018-03-27 13:05:32,607 [INFO] TPE using 92/92 trials with best loss 0.011121
2018-03-27 13:05:32,614 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 13:05:33,602 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 9s - loss: 0.0499 - acc: 0.9833 - val_loss: 0.0294 - val_acc: 0.9880
Epoch 2/200
 - 1s - loss: 0.0249 - acc: 0.9926 - val_loss: 0.0272 - val_acc: 0.9892
[... epochs 3-199 omitted: val_loss plateaus, easing from 0.0264 down to 0.0218 while val_acc settles at 0.9922 ...]
Epoch 200/200
 - 1s - loss: 0.0184 - acc: 0.9948 - val_loss: 0.0218 - val_acc: 0.9922
2018-03-27 13:09:01,601 [INFO] Evaluate...
2018-03-27 13:09:07,049 [INFO] Done!
2018-03-27 13:09:07,056 [INFO] tpe_transform took 0.002602 seconds
2018-03-27 13:09:07,056 [INFO] TPE using 93/93 trials with best loss 0.011121
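The trial above runs the full 200 epochs even though val_loss has plateaued long before the end; in Keras this is normally cut short with an `EarlyStopping` callback. The patience logic that callback implements can be sketched in plain Python (a minimal sketch; the `patience` and `min_delta` names mirror the Keras parameters but this helper itself is hypothetical, not from the notebook):

```python
def early_stop_epoch(val_losses, patience=10, min_delta=1e-4):
    """Return the 1-based epoch at which training would stop:
    after `patience` consecutive epochs with no improvement
    greater than `min_delta`, or the last epoch if never triggered."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if best - loss > min_delta:  # meaningful improvement: reset the counter
            best = loss
            wait = 0
        else:                        # no improvement this epoch
            wait += 1
            if wait >= patience:
                return epoch
    return len(val_losses)

# A plateauing curve like the log above stops long before 200 epochs:
losses = [0.03, 0.025, 0.022] + [0.0220] * 197
print(early_stop_epoch(losses, patience=10))  # -> 13
```

With the real callback (`keras.callbacks.EarlyStopping(monitor='val_loss', patience=10)`) passed to `model.fit`, each TPE trial would spend seconds instead of minutes on its flat tail.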
2018-03-27 13:09:07,063 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 13:09:08,058 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 9s - loss: 0.0666 - acc: 0.9754 - val_loss: 0.0254 - val_acc: 0.9936
Epoch 2/200
 - 1s - loss: 0.0298 - acc: 0.9908 - val_loss: 0.0215 - val_acc: 0.9932
[epochs 3-140 elided: val_loss falls gradually from 0.0204 to 0.0161, val_acc settles around 0.9942]
Epoch 141/200
 - 1s - loss: 0.0151 - acc: 0.9955 - val_loss: 0.0160 - val_acc: 0.9942
2018-03-27 13:11:39,959 [INFO] Evaluate...
2018-03-27 13:11:45,391 [INFO] Done!
2018-03-27 13:11:45,399 [INFO] tpe_transform took 0.003388 seconds
2018-03-27 13:11:45,400 [INFO] TPE using 94/94 trials with best loss 0.011121
2018-03-27 13:11:45,407 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 13:11:46,396 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 9s - loss: 0.0407 - acc: 0.9853 - val_loss: 0.0282 - val_acc: 0.9910
Epoch 2/200
 - 1s - loss: 0.0184 - acc: 0.9946 - val_loss: 0.0262 - val_acc: 0.9918
[epochs 3-146 elided: val_loss declines slowly from 0.0257 to 0.0227, val_acc steady around 0.9924]
Epoch 147/200
 - 1s - loss: 0.0114 - acc: 0.9966 - val_loss: 0.0227 - val_acc: 0.9924
2018-03-27 13:14:24,650 [INFO] Evaluate...
2018-03-27 13:14:30,100 [INFO] Done!
2018-03-27 13:14:30,107 [INFO] tpe_transform took 0.002446 seconds
2018-03-27 13:14:30,108 [INFO] TPE using 95/95 trials with best loss 0.011121
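The "load features from ..." lines in each trial read precomputed bottleneck features for InceptionV3, Xception, and ResNet50 and feed their concatenation to the small dense classifier that the "Fit..." step trains. The merge step can be sketched with NumPy (a sketch only: the real features come from the listed .h5 files, and the 2048-dim shapes here are illustrative stand-ins, not read from the notebook):

```python
import numpy as np

# Stand-in per-model bottleneck features; in the notebook these are
# loaded from original_InceptionV3.h5, original_Xception.h5,
# and original_ResNet50.h5 instead of generated randomly.
n_samples = 4
feats = {
    "InceptionV3": np.random.rand(n_samples, 2048),
    "Xception":    np.random.rand(n_samples, 2048),
    "ResNet50":    np.random.rand(n_samples, 2048),
}

# Concatenate along the feature axis so each sample becomes one long
# vector: the single input matrix for the dense classifier.
X = np.concatenate(
    [feats[k] for k in ("InceptionV3", "Xception", "ResNet50")], axis=1
)
print(X.shape)  # -> (4, 6144)
```

Training on these fixed feature vectors is why each epoch above takes about a second: only the small top classifier is optimized, never the three convolutional backbones.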
2018-03-27 13:14:30,115 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 13:14:31,103 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 9s - loss: 0.0482 - acc: 0.9795 - val_loss: 0.0173 - val_acc: 0.9936
Epoch 2/200
 - 1s - loss: 0.0239 - acc: 0.9926 - val_loss: 0.0151 - val_acc: 0.9954
[epochs 3-69 elided: val_loss declines from 0.0147 to 0.0126, val_acc improves to 0.9962]
Epoch 70/200
 - 1s - loss: 0.0176 - acc: 0.9944 - val_loss: 0.0126 - val_acc: 0.9962
 - 1s - loss: 0.0176 - acc: 0.9944 - val_loss: 0.0126 - val_acc: 0.9962
Epoch 71/200
 - 1s - loss: 0.0173 - acc: 0.9946 - val_loss: 0.0126 - val_acc: 0.9962
Epoch 72/200
 - 1s - loss: 0.0171 - acc: 0.9943 - val_loss: 0.0126 - val_acc: 0.9962
Epoch 73/200
 - 1s - loss: 0.0169 - acc: 0.9949 - val_loss: 0.0126 - val_acc: 0.9962
Epoch 74/200
 - 1s - loss: 0.0175 - acc: 0.9946 - val_loss: 0.0125 - val_acc: 0.9962
Epoch 75/200
 - 1s - loss: 0.0174 - acc: 0.9947 - val_loss: 0.0125 - val_acc: 0.9964
Epoch 76/200
 - 1s - loss: 0.0172 - acc: 0.9947 - val_loss: 0.0125 - val_acc: 0.9962
Epoch 77/200
 - 1s - loss: 0.0171 - acc: 0.9950 - val_loss: 0.0125 - val_acc: 0.9964
Epoch 78/200
 - 1s - loss: 0.0175 - acc: 0.9946 - val_loss: 0.0125 - val_acc: 0.9964
Epoch 79/200
 - 1s - loss: 0.0175 - acc: 0.9947 - val_loss: 0.0125 - val_acc: 0.9964
Epoch 80/200
 - 1s - loss: 0.0173 - acc: 0.9951 - val_loss: 0.0125 - val_acc: 0.9964
Epoch 81/200
 - 1s - loss: 0.0176 - acc: 0.9950 - val_loss: 0.0125 - val_acc: 0.9964
Epoch 82/200
 - 1s - loss: 0.0177 - acc: 0.9947 - val_loss: 0.0125 - val_acc: 0.9966
Epoch 83/200
 - 1s - loss: 0.0172 - acc: 0.9949 - val_loss: 0.0125 - val_acc: 0.9966
Epoch 84/200
 - 1s - loss: 0.0172 - acc: 0.9949 - val_loss: 0.0125 - val_acc: 0.9966
Epoch 85/200
 - 1s - loss: 0.0171 - acc: 0.9947 - val_loss: 0.0125 - val_acc: 0.9966
Epoch 86/200
 - 1s - loss: 0.0172 - acc: 0.9947 - val_loss: 0.0125 - val_acc: 0.9966
Epoch 87/200
 - 1s - loss: 0.0173 - acc: 0.9950 - val_loss: 0.0125 - val_acc: 0.9966
Epoch 88/200
 - 1s - loss: 0.0169 - acc: 0.9948 - val_loss: 0.0125 - val_acc: 0.9966
Epoch 89/200
 - 1s - loss: 0.0170 - acc: 0.9946 - val_loss: 0.0125 - val_acc: 0.9966
Epoch 90/200
 - 1s - loss: 0.0171 - acc: 0.9947 - val_loss: 0.0125 - val_acc: 0.9966
Epoch 91/200
 - 1s - loss: 0.0168 - acc: 0.9947 - val_loss: 0.0125 - val_acc: 0.9966
Epoch 92/200
 - 1s - loss: 0.0174 - acc: 0.9947 - val_loss: 0.0125 - val_acc: 0.9966
Epoch 93/200
 - 1s - loss: 0.0169 - acc: 0.9950 - val_loss: 0.0125 - val_acc: 0.9966
Epoch 94/200
 - 1s - loss: 0.0174 - acc: 0.9944 - val_loss: 0.0125 - val_acc: 0.9966
Epoch 95/200
 - 1s - loss: 0.0170 - acc: 0.9949 - val_loss: 0.0125 - val_acc: 0.9966
Epoch 96/200
 - 1s - loss: 0.0170 - acc: 0.9946 - val_loss: 0.0125 - val_acc: 0.9966
Epoch 97/200
 - 1s - loss: 0.0168 - acc: 0.9951 - val_loss: 0.0125 - val_acc: 0.9966
Epoch 98/200
 - 1s - loss: 0.0171 - acc: 0.9949 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 99/200
 - 1s - loss: 0.0171 - acc: 0.9949 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 100/200
 - 1s - loss: 0.0173 - acc: 0.9947 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 101/200
 - 1s - loss: 0.0171 - acc: 0.9950 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 102/200
 - 1s - loss: 0.0171 - acc: 0.9950 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 103/200
 - 1s - loss: 0.0171 - acc: 0.9948 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 104/200
 - 1s - loss: 0.0169 - acc: 0.9949 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 105/200
 - 1s - loss: 0.0174 - acc: 0.9947 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 106/200
 - 1s - loss: 0.0169 - acc: 0.9947 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 107/200
 - 1s - loss: 0.0169 - acc: 0.9950 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 108/200
 - 1s - loss: 0.0169 - acc: 0.9951 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 109/200
 - 1s - loss: 0.0166 - acc: 0.9950 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 110/200
 - 1s - loss: 0.0170 - acc: 0.9948 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 111/200
 - 1s - loss: 0.0168 - acc: 0.9950 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 112/200
 - 1s - loss: 0.0169 - acc: 0.9950 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 113/200
 - 1s - loss: 0.0172 - acc: 0.9950 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 114/200
 - 1s - loss: 0.0172 - acc: 0.9946 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 115/200
 - 1s - loss: 0.0170 - acc: 0.9948 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 116/200
 - 1s - loss: 0.0169 - acc: 0.9950 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 117/200
 - 1s - loss: 0.0167 - acc: 0.9951 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 118/200
 - 1s - loss: 0.0165 - acc: 0.9949 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 119/200
 - 1s - loss: 0.0169 - acc: 0.9946 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 120/200
 - 1s - loss: 0.0168 - acc: 0.9947 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 121/200
 - 1s - loss: 0.0170 - acc: 0.9951 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 122/200
 - 1s - loss: 0.0167 - acc: 0.9950 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 123/200
 - 1s - loss: 0.0162 - acc: 0.9950 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 124/200
 - 1s - loss: 0.0166 - acc: 0.9947 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 125/200
 - 1s - loss: 0.0171 - acc: 0.9947 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 126/200
 - 1s - loss: 0.0168 - acc: 0.9951 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 127/200
 - 1s - loss: 0.0172 - acc: 0.9944 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 128/200
 - 1s - loss: 0.0170 - acc: 0.9949 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 129/200
 - 1s - loss: 0.0166 - acc: 0.9948 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 130/200
 - 1s - loss: 0.0167 - acc: 0.9955 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 131/200
 - 1s - loss: 0.0167 - acc: 0.9947 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 132/200
 - 1s - loss: 0.0170 - acc: 0.9950 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 133/200
 - 1s - loss: 0.0166 - acc: 0.9951 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 134/200
 - 1s - loss: 0.0170 - acc: 0.9947 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 135/200
 - 1s - loss: 0.0169 - acc: 0.9949 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 136/200
 - 1s - loss: 0.0164 - acc: 0.9948 - val_loss: 0.0124 - val_acc: 0.9966
Epoch 137/200
 - 1s - loss: 0.0165 - acc: 0.9949 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 138/200
 - 1s - loss: 0.0167 - acc: 0.9947 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 139/200
 - 1s - loss: 0.0162 - acc: 0.9952 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 140/200
 - 1s - loss: 0.0162 - acc: 0.9953 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 141/200
 - 1s - loss: 0.0166 - acc: 0.9951 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 142/200
 - 1s - loss: 0.0168 - acc: 0.9950 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 143/200
 - 1s - loss: 0.0168 - acc: 0.9948 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 144/200
 - 1s - loss: 0.0164 - acc: 0.9953 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 145/200
 - 1s - loss: 0.0167 - acc: 0.9950 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 146/200
 - 1s - loss: 0.0166 - acc: 0.9953 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 147/200
 - 1s - loss: 0.0167 - acc: 0.9948 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 148/200
 - 1s - loss: 0.0165 - acc: 0.9946 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 149/200
 - 1s - loss: 0.0172 - acc: 0.9950 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 150/200
 - 1s - loss: 0.0166 - acc: 0.9946 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 151/200
 - 1s - loss: 0.0168 - acc: 0.9950 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 152/200
 - 1s - loss: 0.0164 - acc: 0.9950 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 153/200
 - 1s - loss: 0.0168 - acc: 0.9950 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 154/200
 - 1s - loss: 0.0169 - acc: 0.9949 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 155/200
 - 1s - loss: 0.0167 - acc: 0.9950 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 156/200
 - 1s - loss: 0.0168 - acc: 0.9949 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 157/200
 - 1s - loss: 0.0166 - acc: 0.9952 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 158/200
 - 1s - loss: 0.0166 - acc: 0.9949 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 159/200
 - 1s - loss: 0.0166 - acc: 0.9949 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 160/200
 - 1s - loss: 0.0166 - acc: 0.9949 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 161/200
 - 1s - loss: 0.0164 - acc: 0.9951 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 162/200
 - 1s - loss: 0.0167 - acc: 0.9955 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 163/200
 - 1s - loss: 0.0168 - acc: 0.9950 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 164/200
 - 1s - loss: 0.0164 - acc: 0.9951 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 165/200
 - 1s - loss: 0.0165 - acc: 0.9953 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 166/200
 - 1s - loss: 0.0165 - acc: 0.9951 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 167/200
 - 1s - loss: 0.0164 - acc: 0.9951 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 168/200
 - 1s - loss: 0.0164 - acc: 0.9950 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 169/200
 - 1s - loss: 0.0168 - acc: 0.9949 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 170/200
 - 1s - loss: 0.0163 - acc: 0.9954 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 171/200
 - 1s - loss: 0.0166 - acc: 0.9947 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 172/200
 - 1s - loss: 0.0164 - acc: 0.9946 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 173/200
 - 1s - loss: 0.0165 - acc: 0.9948 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 174/200
 - 1s - loss: 0.0167 - acc: 0.9947 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 175/200
 - 1s - loss: 0.0165 - acc: 0.9948 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 176/200
 - 1s - loss: 0.0168 - acc: 0.9947 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 177/200
 - 1s - loss: 0.0165 - acc: 0.9948 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 178/200
 - 1s - loss: 0.0165 - acc: 0.9949 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 179/200
 - 1s - loss: 0.0162 - acc: 0.9949 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 180/200
 - 1s - loss: 0.0162 - acc: 0.9951 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 181/200
 - 1s - loss: 0.0164 - acc: 0.9949 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 182/200
 - 1s - loss: 0.0161 - acc: 0.9950 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 183/200
 - 1s - loss: 0.0168 - acc: 0.9947 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 184/200
 - 1s - loss: 0.0164 - acc: 0.9947 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 185/200
 - 1s - loss: 0.0166 - acc: 0.9949 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 186/200
 - 1s - loss: 0.0166 - acc: 0.9948 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 187/200
 - 1s - loss: 0.0166 - acc: 0.9952 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 188/200
 - 1s - loss: 0.0165 - acc: 0.9951 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 189/200
 - 1s - loss: 0.0164 - acc: 0.9951 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 190/200
 - 1s - loss: 0.0161 - acc: 0.9949 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 191/200
 - 1s - loss: 0.0163 - acc: 0.9952 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 192/200
 - 1s - loss: 0.0164 - acc: 0.9955 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 193/200
 - 1s - loss: 0.0159 - acc: 0.9956 - val_loss: 0.0123 - val_acc: 0.9966
Epoch 194/200
 - 1s - loss: 0.0169 - acc: 0.9946 - val_loss: 0.0122 - val_acc: 0.9966
Epoch 195/200
 - 1s - loss: 0.0166 - acc: 0.9950 - val_loss: 0.0122 - val_acc: 0.9966
Epoch 196/200
 - 1s - loss: 0.0164 - acc: 0.9950 - val_loss: 0.0122 - val_acc: 0.9966
Epoch 197/200
 - 1s - loss: 0.0166 - acc: 0.9947 - val_loss: 0.0122 - val_acc: 0.9966
Epoch 198/200
 - 1s - loss: 0.0162 - acc: 0.9952 - val_loss: 0.0122 - val_acc: 0.9966
Epoch 199/200
 - 1s - loss: 0.0165 - acc: 0.9950 - val_loss: 0.0122 - val_acc: 0.9966
Epoch 200/200
 - 1s - loss: 0.0161 - acc: 0.9950 - val_loss: 0.0122 - val_acc: 0.9966
2018-03-27 13:17:58,844 [INFO] Evaluate...
2018-03-27 13:18:04,350 [INFO] Done!
2018-03-27 13:18:04,356 [INFO] tpe_transform took 0.002501 seconds
2018-03-27 13:18:04,357 [INFO] TPE using 96/96 trials with best loss 0.011121
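The `TPE using 96/96 trials with best loss 0.011121` lines come from a hyperopt search that repeatedly retrains the ensemble classifier on the extracted features. As a minimal stdlib sketch of that outer-loop bookkeeping, with plain random search standing in for TPE and a hypothetical `objective` in place of the real 200-epoch training run (names and search space here are illustrative assumptions, not the notebook's actual code):

```python
# Simplified stand-in for the search loop behind the
# "TPE using N/N trials with best loss ..." log lines.
# Random search replaces hyperopt's TPE sampler; the objective
# is a hypothetical proxy for "train the classifier, return val_loss".
import random

def objective(params):
    # Hypothetical stand-in for a 200-epoch training run returning val_loss.
    dropout, lr = params["dropout"], params["lr"]
    return (dropout - 0.5) ** 2 + (lr - 1e-4) ** 2 * 1e6

def run_search(n_trials, seed=0):
    rng = random.Random(seed)
    trials = []  # (loss, params) per completed trial
    for i in range(n_trials):
        params = {"dropout": rng.uniform(0.0, 0.8),
                  "lr": rng.uniform(1e-5, 1e-3)}
        loss = objective(params)
        trials.append((loss, params))
        best = min(trials, key=lambda t: t[0])[0]
        # Mirrors the "using i/i trials with best loss ..." bookkeeping.
        print(f"trial {i + 1}/{n_trials} best loss {best:.6f}")
    return min(trials, key=lambda t: t[0])

best_loss, best_params = run_search(20)
```

In the real run, each trial re-loads the cached InceptionV3/Xception/ResNet50 features, builds and fits the small classifier, and hands the resulting validation loss back to the optimizer, which is why the fit logs above repeat with slightly different hyperparameters each time.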
2018-03-27 13:18:04,365 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 13:18:05,357 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 9s - loss: 0.0623 - acc: 0.9788 - val_loss: 0.0317 - val_acc: 0.9886
Epoch 2/200
 - 1s - loss: 0.0302 - acc: 0.9909 - val_loss: 0.0292 - val_acc: 0.9886
Epoch 3/200
 - 1s - loss: 0.0275 - acc: 0.9921 - val_loss: 0.0283 - val_acc: 0.9888
[epochs 4-199 omitted: val_loss declines slowly from 0.0277 to 0.0234, val_acc plateaus around 0.9904-0.9906]
Epoch 200/200
 - 1s - loss: 0.0209 - acc: 0.9935 - val_loss: 0.0234 - val_acc: 0.9904
2018-03-27 13:21:34,036 [INFO] Evaluate...
2018-03-27 13:21:39,629 [INFO] Done!
2018-03-27 13:21:39,636 [INFO] tpe_transform took 0.003266 seconds
2018-03-27 13:21:39,637 [INFO] TPE using 97/97 trials with best loss 0.011121
2018-03-27 13:21:39,645 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 13:21:40,633 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 9s - loss: 0.0514 - acc: 0.9801 - val_loss: 0.0210 - val_acc: 0.9920
Epoch 2/200
 - 1s - loss: 0.0271 - acc: 0.9909 - val_loss: 0.0193 - val_acc: 0.9932
Epoch 3/200
 - 1s - loss: 0.0256 - acc: 0.9912 - val_loss: 0.0185 - val_acc: 0.9932
[epochs 4-32 omitted: val_loss falls from 0.0182 to 0.0160, val_acc hovers around 0.9934-0.9938]
Epoch 33/200
 - 1s - loss: 0.0210 - acc: 0.9935 - val_loss: 0.0159 - val_acc: 0.9936
Epoch 34/200
 - 1s - loss: 0.0213 - acc: 0.9926 - val_loss: 0.0159 - val_acc: 0.9936
Epoch 35/200
 - 1s - loss: 0.0206 - acc: 0.9931 - val_loss: 0.0159 - val_acc: 0.9936
Epoch 36/200
 - 1s - loss: 0.0209 - acc: 0.9935 - val_loss: 0.0159 - val_acc: 0.9938
Epoch 37/200
 - 1s - loss: 0.0211 - acc: 0.9936 - val_loss: 0.0159 - val_acc: 0.9938
Epoch 38/200
 - 1s - loss: 0.0202 - acc: 0.9940 - val_loss: 0.0159 - val_acc: 0.9938
Epoch 39/200
 - 1s - loss: 0.0206 - acc: 0.9930 - val_loss: 0.0159 - val_acc: 0.9938
Epoch 40/200
 - 1s - loss: 0.0207 - acc: 0.9930 - val_loss: 0.0158 - val_acc: 0.9938
Epoch 41/200
 - 1s - loss: 0.0212 - acc: 0.9932 - val_loss: 0.0158 - val_acc: 0.9936
Epoch 42/200
 - 1s - loss: 0.0194 - acc: 0.9931 - val_loss: 0.0158 - val_acc: 0.9936
Epoch 43/200
 - 1s - loss: 0.0209 - acc: 0.9934 - val_loss: 0.0158 - val_acc: 0.9938
Epoch 44/200
 - 1s - loss: 0.0214 - acc: 0.9932 - val_loss: 0.0158 - val_acc: 0.9938
Epoch 45/200
 - 1s - loss: 0.0208 - acc: 0.9929 - val_loss: 0.0158 - val_acc: 0.9938
Epoch 46/200
 - 1s - loss: 0.0206 - acc: 0.9936 - val_loss: 0.0158 - val_acc: 0.9938
Epoch 47/200
 - 1s - loss: 0.0212 - acc: 0.9933 - val_loss: 0.0158 - val_acc: 0.9936
Epoch 48/200
 - 1s - loss: 0.0206 - acc: 0.9931 - val_loss: 0.0158 - val_acc: 0.9938
Epoch 49/200
 - 1s - loss: 0.0214 - acc: 0.9935 - val_loss: 0.0157 - val_acc: 0.9938
Epoch 50/200
 - 1s - loss: 0.0212 - acc: 0.9935 - val_loss: 0.0157 - val_acc: 0.9938
Epoch 51/200
 - 1s - loss: 0.0206 - acc: 0.9929 - val_loss: 0.0157 - val_acc: 0.9936
Epoch 52/200
 - 1s - loss: 0.0213 - acc: 0.9931 - val_loss: 0.0157 - val_acc: 0.9936
Epoch 53/200
 - 1s - loss: 0.0196 - acc: 0.9940 - val_loss: 0.0157 - val_acc: 0.9936
Epoch 54/200
 - 1s - loss: 0.0211 - acc: 0.9931 - val_loss: 0.0157 - val_acc: 0.9936
Epoch 55/200
 - 1s - loss: 0.0198 - acc: 0.9931 - val_loss: 0.0157 - val_acc: 0.9936
Epoch 56/200
 - 1s - loss: 0.0219 - acc: 0.9928 - val_loss: 0.0157 - val_acc: 0.9936
Epoch 57/200
 - 1s - loss: 0.0204 - acc: 0.9933 - val_loss: 0.0157 - val_acc: 0.9936
Epoch 58/200
 - 1s - loss: 0.0220 - acc: 0.9927 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 59/200
 - 1s - loss: 0.0207 - acc: 0.9932 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 60/200
 - 1s - loss: 0.0214 - acc: 0.9931 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 61/200
 - 1s - loss: 0.0198 - acc: 0.9934 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 62/200
 - 1s - loss: 0.0203 - acc: 0.9933 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 63/200
 - 1s - loss: 0.0204 - acc: 0.9935 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 64/200
 - 1s - loss: 0.0205 - acc: 0.9937 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 65/200
 - 1s - loss: 0.0213 - acc: 0.9928 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 66/200
 - 1s - loss: 0.0204 - acc: 0.9937 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 67/200
 - 1s - loss: 0.0200 - acc: 0.9928 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 68/200
 - 1s - loss: 0.0214 - acc: 0.9931 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 69/200
 - 1s - loss: 0.0206 - acc: 0.9931 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 70/200
 - 1s - loss: 0.0203 - acc: 0.9932 - val_loss: 0.0156 - val_acc: 0.9936
Epoch 71/200
 - 1s - loss: 0.0191 - acc: 0.9942 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 72/200
 - 1s - loss: 0.0204 - acc: 0.9935 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 73/200
 - 1s - loss: 0.0201 - acc: 0.9938 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 74/200
 - 1s - loss: 0.0206 - acc: 0.9931 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 75/200
 - 1s - loss: 0.0204 - acc: 0.9932 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 76/200
 - 1s - loss: 0.0202 - acc: 0.9931 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 77/200
 - 1s - loss: 0.0214 - acc: 0.9931 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 78/200
 - 1s - loss: 0.0202 - acc: 0.9935 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 79/200
 - 1s - loss: 0.0210 - acc: 0.9927 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 80/200
 - 1s - loss: 0.0209 - acc: 0.9935 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 81/200
 - 1s - loss: 0.0214 - acc: 0.9931 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 82/200
 - 1s - loss: 0.0205 - acc: 0.9936 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 83/200
 - 1s - loss: 0.0206 - acc: 0.9937 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 84/200
 - 1s - loss: 0.0196 - acc: 0.9940 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 85/200
 - 1s - loss: 0.0198 - acc: 0.9935 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 86/200
 - 1s - loss: 0.0203 - acc: 0.9937 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 87/200
 - 1s - loss: 0.0205 - acc: 0.9936 - val_loss: 0.0155 - val_acc: 0.9936
Epoch 88/200
 - 1s - loss: 0.0185 - acc: 0.9940 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 89/200
 - 1s - loss: 0.0208 - acc: 0.9937 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 90/200
 - 1s - loss: 0.0205 - acc: 0.9933 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 91/200
 - 1s - loss: 0.0212 - acc: 0.9931 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 92/200
 - 1s - loss: 0.0195 - acc: 0.9937 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 93/200
 - 1s - loss: 0.0200 - acc: 0.9938 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 94/200
 - 1s - loss: 0.0204 - acc: 0.9937 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 95/200
 - 1s - loss: 0.0200 - acc: 0.9935 - val_loss: 0.0154 - val_acc: 0.9938
Epoch 96/200
 - 1s - loss: 0.0205 - acc: 0.9936 - val_loss: 0.0154 - val_acc: 0.9938
Epoch 97/200
 - 1s - loss: 0.0200 - acc: 0.9940 - val_loss: 0.0154 - val_acc: 0.9938
Epoch 98/200
 - 1s - loss: 0.0196 - acc: 0.9939 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 99/200
 - 1s - loss: 0.0205 - acc: 0.9935 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 100/200
 - 1s - loss: 0.0204 - acc: 0.9933 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 101/200
 - 1s - loss: 0.0199 - acc: 0.9933 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 102/200
 - 1s - loss: 0.0203 - acc: 0.9938 - val_loss: 0.0154 - val_acc: 0.9938
Epoch 103/200
 - 1s - loss: 0.0204 - acc: 0.9937 - val_loss: 0.0154 - val_acc: 0.9938
Epoch 104/200
 - 1s - loss: 0.0207 - acc: 0.9931 - val_loss: 0.0154 - val_acc: 0.9936
Epoch 105/200
 - 1s - loss: 0.0198 - acc: 0.9935 - val_loss: 0.0154 - val_acc: 0.9938
Epoch 106/200
 - 1s - loss: 0.0196 - acc: 0.9932 - val_loss: 0.0154 - val_acc: 0.9938
Epoch 107/200
 - 1s - loss: 0.0195 - acc: 0.9936 - val_loss: 0.0154 - val_acc: 0.9938
Epoch 108/200
 - 1s - loss: 0.0199 - acc: 0.9934 - val_loss: 0.0154 - val_acc: 0.9938
Epoch 109/200
 - 1s - loss: 0.0191 - acc: 0.9940 - val_loss: 0.0154 - val_acc: 0.9938
Epoch 110/200
 - 1s - loss: 0.0203 - acc: 0.9937 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 111/200
 - 1s - loss: 0.0206 - acc: 0.9931 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 112/200
 - 1s - loss: 0.0192 - acc: 0.9937 - val_loss: 0.0153 - val_acc: 0.9938
Epoch 113/200
 - 1s - loss: 0.0207 - acc: 0.9928 - val_loss: 0.0153 - val_acc: 0.9938
Epoch 114/200
 - 1s - loss: 0.0198 - acc: 0.9940 - val_loss: 0.0153 - val_acc: 0.9938
Epoch 115/200
 - 1s - loss: 0.0199 - acc: 0.9935 - val_loss: 0.0153 - val_acc: 0.9936
Epoch 116/200
 - 1s - loss: 0.0189 - acc: 0.9937 - val_loss: 0.0153 - val_acc: 0.9938
Epoch 117/200
 - 1s - loss: 0.0208 - acc: 0.9931 - val_loss: 0.0153 - val_acc: 0.9938
Epoch 118/200
 - 1s - loss: 0.0195 - acc: 0.9941 - val_loss: 0.0153 - val_acc: 0.9940
Epoch 119/200
 - 1s - loss: 0.0194 - acc: 0.9935 - val_loss: 0.0153 - val_acc: 0.9938
Epoch 120/200
 - 1s - loss: 0.0205 - acc: 0.9931 - val_loss: 0.0153 - val_acc: 0.9938
Epoch 121/200
 - 1s - loss: 0.0208 - acc: 0.9929 - val_loss: 0.0153 - val_acc: 0.9938
Epoch 122/200
 - 1s - loss: 0.0201 - acc: 0.9936 - val_loss: 0.0153 - val_acc: 0.9938
Epoch 123/200
 - 1s - loss: 0.0196 - acc: 0.9937 - val_loss: 0.0153 - val_acc: 0.9938
Epoch 124/200
 - 1s - loss: 0.0197 - acc: 0.9937 - val_loss: 0.0153 - val_acc: 0.9938
Epoch 125/200
 - 1s - loss: 0.0196 - acc: 0.9936 - val_loss: 0.0153 - val_acc: 0.9938
Epoch 126/200
 - 1s - loss: 0.0193 - acc: 0.9937 - val_loss: 0.0153 - val_acc: 0.9940
Epoch 127/200
 - 1s - loss: 0.0191 - acc: 0.9936 - val_loss: 0.0153 - val_acc: 0.9940
Epoch 128/200
 - 1s - loss: 0.0201 - acc: 0.9940 - val_loss: 0.0153 - val_acc: 0.9940
Epoch 129/200
 - 1s - loss: 0.0192 - acc: 0.9941 - val_loss: 0.0153 - val_acc: 0.9938
Epoch 130/200
 - 1s - loss: 0.0200 - acc: 0.9933 - val_loss: 0.0153 - val_acc: 0.9940
Epoch 131/200
 - 1s - loss: 0.0205 - acc: 0.9937 - val_loss: 0.0153 - val_acc: 0.9938
Epoch 132/200
 - 1s - loss: 0.0207 - acc: 0.9931 - val_loss: 0.0153 - val_acc: 0.9940
Epoch 133/200
 - 1s - loss: 0.0196 - acc: 0.9939 - val_loss: 0.0153 - val_acc: 0.9940
Epoch 134/200
 - 1s - loss: 0.0192 - acc: 0.9938 - val_loss: 0.0153 - val_acc: 0.9940
Epoch 135/200
 - 1s - loss: 0.0209 - acc: 0.9933 - val_loss: 0.0153 - val_acc: 0.9940
Epoch 136/200
 - 1s - loss: 0.0197 - acc: 0.9938 - val_loss: 0.0153 - val_acc: 0.9940
Epoch 137/200
 - 1s - loss: 0.0202 - acc: 0.9934 - val_loss: 0.0153 - val_acc: 0.9940
Epoch 138/200
 - 1s - loss: 0.0205 - acc: 0.9932 - val_loss: 0.0153 - val_acc: 0.9940
Epoch 139/200
 - 1s - loss: 0.0192 - acc: 0.9940 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 140/200
 - 1s - loss: 0.0203 - acc: 0.9938 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 141/200
 - 1s - loss: 0.0192 - acc: 0.9940 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 142/200
 - 1s - loss: 0.0201 - acc: 0.9937 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 143/200
 - 1s - loss: 0.0207 - acc: 0.9931 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 144/200
 - 1s - loss: 0.0201 - acc: 0.9937 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 145/200
 - 1s - loss: 0.0203 - acc: 0.9933 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 146/200
 - 1s - loss: 0.0200 - acc: 0.9933 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 147/200
 - 1s - loss: 0.0188 - acc: 0.9938 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 148/200
 - 1s - loss: 0.0203 - acc: 0.9938 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 149/200
 - 1s - loss: 0.0200 - acc: 0.9940 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 150/200
 - 1s - loss: 0.0184 - acc: 0.9937 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 151/200
 - 1s - loss: 0.0189 - acc: 0.9936 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 152/200
 - 1s - loss: 0.0187 - acc: 0.9937 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 153/200
 - 1s - loss: 0.0196 - acc: 0.9935 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 154/200
 - 1s - loss: 0.0204 - acc: 0.9936 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 155/200
 - 1s - loss: 0.0199 - acc: 0.9942 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 156/200
 - 1s - loss: 0.0186 - acc: 0.9945 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 157/200
 - 1s - loss: 0.0198 - acc: 0.9935 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 158/200
 - 1s - loss: 0.0198 - acc: 0.9929 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 159/200
 - 1s - loss: 0.0196 - acc: 0.9937 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 160/200
 - 1s - loss: 0.0194 - acc: 0.9938 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 161/200
 - 1s - loss: 0.0192 - acc: 0.9938 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 162/200
 - 1s - loss: 0.0195 - acc: 0.9940 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 163/200
 - 1s - loss: 0.0189 - acc: 0.9935 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 164/200
 - 1s - loss: 0.0192 - acc: 0.9935 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 165/200
 - 1s - loss: 0.0199 - acc: 0.9937 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 166/200
 - 1s - loss: 0.0203 - acc: 0.9937 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 167/200
 - 1s - loss: 0.0202 - acc: 0.9940 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 168/200
 - 1s - loss: 0.0191 - acc: 0.9940 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 169/200
 - 1s - loss: 0.0191 - acc: 0.9940 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 170/200
 - 1s - loss: 0.0198 - acc: 0.9936 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 171/200
 - 1s - loss: 0.0191 - acc: 0.9929 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 172/200
 - 1s - loss: 0.0193 - acc: 0.9938 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 173/200
 - 1s - loss: 0.0200 - acc: 0.9938 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 174/200
 - 1s - loss: 0.0203 - acc: 0.9935 - val_loss: 0.0152 - val_acc: 0.9940
Epoch 175/200
 - 1s - loss: 0.0197 - acc: 0.9939 - val_loss: 0.0151 - val_acc: 0.9940
Epoch 176/200
 - 1s - loss: 0.0197 - acc: 0.9936 - val_loss: 0.0151 - val_acc: 0.9940
Epoch 177/200
 - 1s - loss: 0.0182 - acc: 0.9942 - val_loss: 0.0151 - val_acc: 0.9940
Epoch 178/200
 - 1s - loss: 0.0185 - acc: 0.9941 - val_loss: 0.0151 - val_acc: 0.9940
Epoch 179/200
 - 1s - loss: 0.0198 - acc: 0.9935 - val_loss: 0.0151 - val_acc: 0.9940
Epoch 180/200
 - 1s - loss: 0.0197 - acc: 0.9937 - val_loss: 0.0151 - val_acc: 0.9940
Epoch 181/200
 - 1s - loss: 0.0198 - acc: 0.9933 - val_loss: 0.0151 - val_acc: 0.9938
Epoch 182/200
 - 1s - loss: 0.0202 - acc: 0.9937 - val_loss: 0.0151 - val_acc: 0.9940
Epoch 183/200
 - 1s - loss: 0.0184 - acc: 0.9945 - val_loss: 0.0151 - val_acc: 0.9940
Epoch 184/200
 - 1s - loss: 0.0194 - acc: 0.9937 - val_loss: 0.0151 - val_acc: 0.9940
Epoch 185/200
 - 1s - loss: 0.0193 - acc: 0.9939 - val_loss: 0.0151 - val_acc: 0.9938
Epoch 186/200
 - 1s - loss: 0.0192 - acc: 0.9938 - val_loss: 0.0151 - val_acc: 0.9940
Epoch 187/200
 - 1s - loss: 0.0199 - acc: 0.9935 - val_loss: 0.0151 - val_acc: 0.9940
Epoch 188/200
 - 1s - loss: 0.0199 - acc: 0.9937 - val_loss: 0.0151 - val_acc: 0.9940
Epoch 189/200
 - 1s - loss: 0.0206 - acc: 0.9931 - val_loss: 0.0151 - val_acc: 0.9938
Epoch 190/200
 - 1s - loss: 0.0196 - acc: 0.9943 - val_loss: 0.0151 - val_acc: 0.9938
Epoch 191/200
 - 1s - loss: 0.0197 - acc: 0.9940 - val_loss: 0.0151 - val_acc: 0.9938
Epoch 192/200
 - 1s - loss: 0.0193 - acc: 0.9937 - val_loss: 0.0151 - val_acc: 0.9938
Epoch 193/200
 - 1s - loss: 0.0196 - acc: 0.9938 - val_loss: 0.0151 - val_acc: 0.9938
Epoch 194/200
 - 1s - loss: 0.0195 - acc: 0.9938 - val_loss: 0.0151 - val_acc: 0.9938
Epoch 195/200
 - 1s - loss: 0.0188 - acc: 0.9944 - val_loss: 0.0151 - val_acc: 0.9938
Epoch 196/200
 - 1s - loss: 0.0204 - acc: 0.9932 - val_loss: 0.0151 - val_acc: 0.9938
Epoch 197/200
 - 1s - loss: 0.0190 - acc: 0.9941 - val_loss: 0.0151 - val_acc: 0.9938
Epoch 198/200
 - 1s - loss: 0.0194 - acc: 0.9935 - val_loss: 0.0151 - val_acc: 0.9938
Epoch 199/200
 - 1s - loss: 0.0197 - acc: 0.9937 - val_loss: 0.0151 - val_acc: 0.9940
Epoch 200/200
 - 1s - loss: 0.0203 - acc: 0.9940 - val_loss: 0.0151 - val_acc: 0.9938
2018-03-27 13:25:10,237 [INFO] Evaluate...
2018-03-27 13:25:15,851 [INFO] Done!
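Each block of log output above repeats the same per-trial pattern: load the precomputed bottleneck features, build a small classifier, fit, evaluate, and hand the validation loss back to the optimizer, which tracks the best loss seen so far (here 0.011121). A minimal pure-Python sketch of that trial loop, using random sampling as a stand-in for hyperopt's TPE so it runs without extra dependencies; `train_model` and its `lr`/`dropout` parameters are hypothetical placeholders for the actual Keras fit/evaluate step:

```python
import random

def train_model(params):
    # Hypothetical stand-in for: load h5 features -> build model -> fit -> evaluate.
    # A toy quadratic scores the sampled hyperparameters so the loop is runnable.
    lr, dropout = params["lr"], params["dropout"]
    return (lr - 1e-3) ** 2 + (dropout - 0.5) ** 2

def run_trials(n_trials, seed=0):
    """Sample hyperparameters n_trials times and keep the best (lowest) loss,
    mirroring the 'TPE using N/N trials with best loss ...' bookkeeping above."""
    rng = random.Random(seed)
    best = {"loss": float("inf"), "params": None}
    for _ in range(n_trials):
        params = {"lr": rng.uniform(1e-4, 1e-2), "dropout": rng.uniform(0.0, 0.9)}
        loss = train_model(params)
        if loss < best["loss"]:
            best = {"loss": loss, "params": params}
    return best

best = run_trials(100)
```

TPE differs from this sketch in that it models the loss surface and proposes promising points instead of sampling uniformly, but the outer loop and best-loss tracking are the same.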
2018-03-27 13:25:15,858 [INFO] tpe_transform took 0.002444 seconds
2018-03-27 13:25:15,859 [INFO] TPE using 98/98 trials with best loss 0.011121
2018-03-27 13:25:15,867 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 13:25:16,854 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 9s - loss: 0.0825 - acc: 0.9754 - val_loss: 0.0417 - val_acc: 0.9916
Epoch 2/200
 - 1s - loss: 0.0390 - acc: 0.9905 - val_loss: 0.0358 - val_acc: 0.9920
... (epochs 3-199 omitted; val_loss declines slowly from 0.0333 to 0.0233, val_acc between 0.9924 and 0.9930)
Epoch 200/200
 - 1s - loss: 0.0224 - acc: 0.9931 - val_loss: 0.0233 - val_acc: 0.9930
2018-03-27 13:28:46,608 [INFO] Evaluate...
2018-03-27 13:28:52,324 [INFO] Done!
2018-03-27 13:28:52,331 [INFO] tpe_transform took 0.002483 seconds
2018-03-27 13:28:52,332 [INFO] TPE using 99/99 trials with best loss 0.011121
2018-03-27 13:28:52,340 [INFO] Load data...
load features from 「original_InceptionV3.h5」
load features from 「original_Xception.h5」
load features from 「original_ResNet50.h5」
model created
['loss', 'acc']
2018-03-27 13:28:54,189 [INFO] Fit...
Train on 20000 samples, validate on 5000 samples
Epoch 1/200
 - 9s - loss: 0.0541 - acc: 0.9791 - val_loss: 0.0235 - val_acc: 0.9920
Epoch 2/200
 - 1s - loss: 0.0267 - acc: 0.9909 - val_loss: 0.0218 - val_acc: 0.9938
Epoch 3/200
 - 1s - loss: 0.0244 - acc: 0.9913 - val_loss: 0.0213 - val_acc: 0.9924
Epoch 4/200
 - 1s - loss: 0.0232 - acc: 0.9919 - val_loss: 0.0208 - val_acc: 0.9940
Epoch 5/200
 - 1s - loss: 0.0234 - acc: 0.9922 - val_loss: 0.0207 - val_acc: 0.9934
 ... [Epochs 6-198 omitted: loss slowly decreases from ~0.022 to ~0.017, val_loss from 0.0205 to 0.0188, val_acc plateaus at 0.9946] ...
Epoch 199/200
 - 1s - loss: 0.0179 - acc: 0.9937 - val_loss: 0.0188 - val_acc: 0.9946
Epoch 200/200
 - 1s - loss: 0.0179 - acc: 0.9940 - val_loss: 0.0188 - val_acc: 0.9946
2018-03-27 13:32:25,486 [INFO] Evaluate...
2018-03-27 13:32:31,229 [INFO] Done!
In [23]:
datafile = open("trial.pckl","wb")
pickle.dump(trials,datafile)
datafile.close()
print("Done")
Done

Test the model trained with the HyperOpt-optimized parameters

In [2]:
datafile = open("trial.pckl","rb")
trial = pickle.load(datafile)
datafile.close()
In [3]:
trial_sorted = sorted(trial.results, key=lambda t: t["eval"]["loss"]) 
In [4]:
path = trial_sorted[0]["path"]["model"]
best_model = load_model(path)
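For reference, `trial.results` is hyperopt's list of per-trial result dictionaries returned by the objective function; in this notebook each one carries an `"eval"` sub-dict with the validation metrics and a `"path"` to the saved model. A minimal illustration of the selection step above, with hypothetical entries:

```python
# Hypothetical trial results mimicking the structure used in this notebook.
results = [
    {"eval": {"loss": 0.0152}, "path": {"model": "model/trial_0.h5"}},
    {"eval": {"loss": 0.0111}, "path": {"model": "model/trial_1.h5"}},
    {"eval": {"loss": 0.0139}, "path": {"model": "model/trial_2.h5"}},
]

# Sort ascending by validation loss and take the best trial's model path.
best = sorted(results, key=lambda t: t["eval"]["loss"])[0]
print(best["path"]["model"])  # → model/trial_1.h5
```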
In [5]:
Xs = load_test_data_merge(["InceptionV3","Xception","ResNet50"])
load test data from 「original_InceptionV3.h5」
load test data from 「original_Xception.h5」
load test data from 「original_ResNet50.h5」
In [6]:
pred = test_to_csv(best_model, Xs)
12500/12500 [==============================] - 1s 41us/step
Found 12500 images belonging to 1 classes.
   id  label
0   1  0.995
1   2  0.995
2   3  0.995
3   4  0.995
4   5  0.005
5   6  0.005
6   7  0.005
7   8  0.005
8   9  0.005
9  10  0.005
CSV文件已保存至csv_output/pred.csv

Transfer Learning - Fine-tuning - Splitting the Model

Define the model-splitting function

Split the pretrained model MODEL into an untrainable_model and a trainable_model. The split point must fall at the start of the last block (set via n_trainable_layer). The former is used to export feature vectors, which can later be read back directly to speed up training; the latter is attached to the output layer and trained together with it.
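`split_model` lives in helper.py and is not shown here. For a branch-free graph, the mechanics can be sketched as below; this is a simplification, since the real split point must sit at a block boundary (the residual Add connections in ResNet's last block cannot be re-wired one layer at a time):

```python
from keras.layers import Dense, Input
from keras.models import Model

def split_model_sketch(model, n_trainable_layer):
    """Split a branch-free model into a frozen feature extractor and a
    trainable tail that consumes the intermediate features as its input."""
    split = len(model.layers) - n_trainable_layer
    # Everything up to the split point becomes the feature extractor.
    feature_model = Model(model.input, model.layers[split - 1].output)
    # The tail gets a fresh Input with the feature extractor's output shape.
    new_input = Input(feature_model.output_shape[1:])
    x = new_input
    for layer in model.layers[split:]:
        x = layer(x)  # re-wire each remaining layer onto the new input
    trainable_model = Model(new_input, x)
    return feature_model, trainable_model

# Toy stand-in for a pretrained backbone: 8 → 16 → 4 → 1.
inp = Input((8,))
out = Dense(1)(Dense(4)(Dense(16)(inp)))
feature_model, trainable_model = split_model_sketch(Model(inp, out), 1)
```

With `n_trainable_layer = 1`, the feature model ends at the 4-unit layer and the trainable tail is just the final Dense layer fed by a new `(None, 4)` input, mirroring the `new_input` InputLayer visible in the model graph later in this notebook.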

Split the models

Save the two split models to the files "model/XXXX_feature.h5" and "model/XXXX_trainable.h5"

In [11]:
#Split points (number of trainable layers counted from the end) at the last block of each pretrained model:
# ResNet50 : 11
# InceptionV3: 31
# Xception : 6

ResNet50_feature_model,ResNet50_trainable_model = split_model(ResNet50,(224,224),11)
InceptionV3_feature_model,InceptionV3_trainable_model = split_model(InceptionV3,(299,299),31)
Xception_feature_model,Xception_trainable_model = split_model(Xception,(299,299),6)
Saving model:InceptionV3
Done
Saving model:Xception
Done

【Checkpoint】Function to load the split models

Extract features

In [ ]:
from keras.applications import *
from keras.preprocessing.image import *

extract_feature_finetuning("ResNet50",(224,224),resnet50.preprocess_input)
extract_feature_finetuning("InceptionV3",(299,299),inception_v3.preprocess_input)
extract_feature_finetuning("Xception",(299,299),xception.preprocess_input)
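`extract_feature_finetuning` (from helper.py) runs each frozen feature model over every image once and caches the activations on disk, so the small trainable head can be fit many times without re-running the expensive convolutional base. The caching pattern itself can be sketched with random arrays standing in for real activations (`cache_features`/`load_features` are illustrative names, not helpers from this project):

```python
import os
import tempfile

import h5py
import numpy as np

def cache_features(path, features, labels):
    """Run once: store the frozen model's activations so later training
    of the small head reads features straight from disk."""
    with h5py.File(path, "w") as f:
        f.create_dataset("features", data=features)
        f.create_dataset("labels", data=labels)

def load_features(path):
    with h5py.File(path, "r") as f:
        return f["features"][:], f["labels"][:]

# Dummy activations standing in for e.g. ResNet50's pooled 2048-d output.
feats = np.random.rand(8, 2048).astype("float32")
labels = np.array([0, 1] * 4)
path = os.path.join(tempfile.gettempdir(), "demo_features.h5")
cache_features(path, feats, labels)
X, y = load_features(path)
```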

【Checkpoint】Load the fine-tuning feature vectors

Train the fine-tuning model

In [2]:
# The feature files are fairly large, so loading takes a while
model_name = "ResNet50"
X_train, y_train = load_train_data(model_name,dataset="fine",need_shuffle=True)
print("Done")
load features from 「fine_ResNet50.h5」
Done
In [3]:
learn_rates=[0.001]
momentums = [0.9]
decays = [0]
max_epochs = 40
earlystop_patience = 8

# learn_rates=[0.001,0.01]
# momentums = [0.0]
# decays = [0.0]


#TensorBoard is needed as a callback to record training data, but sklearn's GridSearchCV does not support callbacks, so we loop over the parameter grid manually and use TensorBoard to record the runs and find the best parameter combination.
best_model = None
history_minloss = 100
for learning_rate in learn_rates:
    for momentum in momentums:
        for decay in decays:
            
            sgd = optimizers.SGD(lr=learning_rate, decay=decay, momentum=momentum,nesterov=True)

            tensorboard_directory = "logs/fine/%s_lr%s_decay%s_momentum%s_nesterov_loadweights"%(model_name,learning_rate,decay,momentum)
            del_file_if_exist(tensorboard_directory)
            tensorboard = TensorBoard(tensorboard_directory,batch_size=BATCH_SIZE)
            earlyStop = EarlyStopping(monitor='val_loss', patience=earlystop_patience, verbose=1)
            checkpoint = ModelCheckpoint("check_point_model.h5", monitor='val_loss', save_best_only=True, verbose=0)
            
            callbacks=[tensorboard,earlyStop,checkpoint]
            
            # i.e. models["VGG"]=(model,history)

            print("\nStart Fitting Model with lr:%s ,mom:%s , decay:%s"%(learning_rate,momentum,decay))
            model,history = fit_model_fine_output(model_name,X_train,y_train,
                                       epochs=max_epochs,
                                       optimizer=sgd,
                                       callbacks=callbacks,
                                       auto_save=False)
            
            last_val_loss = history.history['val_loss'][-1]
            if history_minloss>last_val_loss:
                best_model = model
                history_minloss = last_val_loss
                print("The model is better than ever!")
                best_model.save("model/%s" % get_filename(model_name, "fine", "output"))
            else:
                print("The model is not good enough")
Start Fitting Model with lr:0.001 ,mom:0.9 , decay:0
/home/ubuntu/anaconda3/envs/tensorflow_p36/lib/python3.6/site-packages/keras/models.py:255: UserWarning: No training configuration found in save file: the model was *not* compiled. Compile it manually.
  warnings.warn('No training configuration found in save file: '
model created
ResNet50 pretrained output weights applyed to new model:[array([[ 0.01401794],
       [ 0.03008838],
       [-0.00407078],
       ...,
       [ 0.01855631],
       [-0.04840425],
       [ 0.12140015]], dtype=float32), array([0.0554946], dtype=float32)]
Train on 20000 samples, validate on 5000 samples
Epoch 1/40
20000/20000 [==============================] - 19s 962us/step - loss: 0.0372 - acc: 0.9864 - val_loss: 0.0328 - val_acc: 0.9874
Epoch 2/40
20000/20000 [==============================] - 18s 881us/step - loss: 0.0297 - acc: 0.9894 - val_loss: 0.0309 - val_acc: 0.9886
Epoch 3/40
20000/20000 [==============================] - 18s 887us/step - loss: 0.0274 - acc: 0.9899 - val_loss: 0.0308 - val_acc: 0.9886
Epoch 4/40
20000/20000 [==============================] - 18s 889us/step - loss: 0.0229 - acc: 0.9920 - val_loss: 0.0328 - val_acc: 0.9876
Epoch 5/40
20000/20000 [==============================] - 18s 887us/step - loss: 0.0221 - acc: 0.9922 - val_loss: 0.0305 - val_acc: 0.9882
Epoch 6/40
20000/20000 [==============================] - 18s 889us/step - loss: 0.0176 - acc: 0.9941 - val_loss: 0.0300 - val_acc: 0.9886
Epoch 7/40
20000/20000 [==============================] - 18s 889us/step - loss: 0.0168 - acc: 0.9940 - val_loss: 0.0294 - val_acc: 0.9886
Epoch 8/40
20000/20000 [==============================] - 18s 888us/step - loss: 0.0153 - acc: 0.9945 - val_loss: 0.0298 - val_acc: 0.9896
Epoch 9/40
20000/20000 [==============================] - 18s 888us/step - loss: 0.0151 - acc: 0.9948 - val_loss: 0.0298 - val_acc: 0.9890
Epoch 10/40
20000/20000 [==============================] - 18s 890us/step - loss: 0.0141 - acc: 0.9954 - val_loss: 0.0306 - val_acc: 0.9884
Epoch 11/40
20000/20000 [==============================] - 18s 888us/step - loss: 0.0118 - acc: 0.9963 - val_loss: 0.0295 - val_acc: 0.9896
Epoch 00011: early stopping
The model is better than ever!

Test the model and export the CSV

In [4]:
# for model_name in ["ResNet50","InceptionV3","Xception"]:
for model_name in ["ResNet50"]:
    model = load_model_fine_output(model_name)
    X_test = load_test_data(model_name,dataset="fine")
    test_to_csv(model,X_test,get_filename(model_name,"fine",ext="csv")) 
load test data from 「fine_ResNet50.h5」
12500/12500 [==============================] - 6s 456us/step
Found 12500 images belonging to 1 classes.
   id  label
0   1  0.995
1   2  0.995
2   3  0.995
3   4  0.995
4   5  0.005
5   6  0.005
6   7  0.005
7   8  0.005
8   9  0.005
9  10  0.005
CSV文件已保存至csv_output/fine_ResNet50.csv
/home/ubuntu/cats_dogs/helper.py:473: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
  df.set_value(index - 1, 'label', predict[i])
In [25]:
SVG(model_to_dot(final_model,show_shapes=True).create(prog='dot', format='svg'))
Out[25]:
(Model graph of the trainable tail: new_input (None, 7, 7, 2048) → res5c residual block (res5c_branch2a/2b/2c Conv2D + BatchNormalization + Activation, with a skip connection merged by add_424) → avg_pool: AveragePooling2D → customAVG: GlobalAveragePooling2D → dropout_2: Dropout → dense_2: Dense (None, 1))

Transfer Learning - Fine-tuning - Layer Freezing

In [2]:
# from helper import *
# import imp
# imp.reload(helper)
from helper import *
from keras.callbacks import *
In [2]:
def run_finetuning_model(MODEL, image_size, n_trainable_layer, preprocess_func=None, epochs_max=20,augmented=False,apply_pretrain_weights=False,optimizer="adadelta",dropout=True):
    # Create Model
#     optimizer = optimizers.SGD(lr=0.001, decay=0, momentum=0.9,nesterov=True)
#     optimizer = optimizers.SGD()
#     optimizer = "adadelta"
    final_model = create_model_fine_whole(MODEL, image_size, 
                                          n_trainable_layer,
                                          optimizer=optimizer,
                                         apply_pretrain_weights=apply_pretrain_weights,
                                         dropout=dropout)
    # Fit Model
    final_model,history = fit_model_fine_whole(final_model, MODEL.__name__, image_size,
                                   n_trainable_layer=n_trainable_layer,
                                   preprocess_func=preprocess_func,
                                   epochs_max=epochs_max,
                                   auto_save=True,
                                    augmented=augmented)

    # Test the model and write out predictions
    print("Testing Model...")
    
    suffix = "loadweight_" if apply_pretrain_weights else ""
    suffix += "lastopen{}".format(n_trainable_layer)
    if augmented:
        suffix += "_augmented"
    test_generator_to_csv(final_model, image_size, csv_filename=get_filename(MODEL.__name__, "fine", suffix,ext="csv"))
    return final_model, history

Stage 1 - Quickly converge the output-layer weights

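In this first stage the whole pretrained base is frozen (n_trainable_layer = 0), so only the randomly initialized classifier head receives gradients; otherwise the large early gradients from the untrained head would damage the pretrained weights. The freezing logic inside a function like create_model_fine_whole can be sketched with stand-in layer objects rather than a real Keras model:

```python
class Layer:
    # Stand-in for a Keras layer; only the attribute we care about here.
    def __init__(self, name):
        self.name = name
        self.trainable = True

def freeze_base(layers, n_trainable_layer):
    """Freeze all but the last n_trainable_layer layers.

    With n_trainable_layer == 0 the entire pretrained base is frozen,
    so only the classifier head stacked on top is updated during training.
    """
    split = len(layers) - n_trainable_layer
    for layer in layers[:split]:
        layer.trainable = False
    return layers

layers = [Layer("conv1"), Layer("res5c_branch2c"), Layer("avg_pool")]
freeze_base(layers, 0)
print([l.trainable for l in layers])  # → [False, False, False]
```

Once the head has converged, a second stage can re-run the same setup with a nonzero n_trainable_layer to unfreeze the last block at a low learning rate.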
In [10]:
from keras.optimizers import *
final_model = create_model_fine_whole(ResNet50, (224,224), 
                                          n_trainable_layer = 0,
                                          optimizer="adadelta",
                                         apply_pretrain_weights=False)
Creating Model...
model created
In [11]:
SVG(model_to_dot(final_model,show_shapes=True).create(prog='dot', format='svg'))
Out[11]:
(Model graph of the full fine-tuning model: input_3 (None, 224, 224, 3) → conv1: Conv2D → bn_conv1: BatchNormalization → activation → max_pooling2d_3 → residual stages res2a/res2b/res2c (55×55×256) → res3a … ; rendering of the remaining ResNet50 layers omitted)
bn3a_branch2c: BatchNormalization input: output: (None, 28, 28, 512) (None, 28, 28, 512) 140108735078240->140108682653992 140108707722576 bn3a_branch1: BatchNormalization input: output: (None, 28, 28, 512) (None, 28, 28, 512) 140108707425808->140108707722576 140108707564960 add_36: Add input: output: [(None, 28, 28, 512), (None, 28, 28, 512)] (None, 28, 28, 512) 140108682653992->140108707564960 140108707722576->140108707564960 140108707998352 activation_111: Activation input: output: (None, 28, 28, 512) (None, 28, 28, 512) 140108707564960->140108707998352 140108707822728 res3b_branch2a: Conv2D input: output: (None, 28, 28, 512) (None, 28, 28, 128) 140108707998352->140108707822728 140108708990816 add_37: Add input: output: [(None, 28, 28, 512), (None, 28, 28, 512)] (None, 28, 28, 512) 140108707998352->140108708990816 140108708145024 bn3b_branch2a: BatchNormalization input: output: (None, 28, 28, 128) (None, 28, 28, 128) 140108707822728->140108708145024 140108708219424 activation_112: Activation input: output: (None, 28, 28, 128) (None, 28, 28, 128) 140108708145024->140108708219424 140108708090712 res3b_branch2b: Conv2D input: output: (None, 28, 28, 128) (None, 28, 28, 128) 140108708219424->140108708090712 140108708540824 bn3b_branch2b: BatchNormalization input: output: (None, 28, 28, 128) (None, 28, 28, 128) 140108708090712->140108708540824 140108708323400 activation_113: Activation input: output: (None, 28, 28, 128) (None, 28, 28, 128) 140108708540824->140108708323400 140108708713472 res3b_branch2c: Conv2D input: output: (None, 28, 28, 128) (None, 28, 28, 512) 140108708323400->140108708713472 140108708641200 bn3b_branch2c: BatchNormalization input: output: (None, 28, 28, 512) (None, 28, 28, 512) 140108708713472->140108708641200 140108708641200->140108708990816 140108708850432 activation_114: Activation input: output: (None, 28, 28, 512) (None, 28, 28, 512) 140108708990816->140108708850432 140108709299704 res3c_branch2a: Conv2D input: output: (None, 28, 28, 512) 
(None, 28, 28, 128) 140108708850432->140108709299704 140108709892448 add_38: Add input: output: [(None, 28, 28, 512), (None, 28, 28, 512)] (None, 28, 28, 512) 140108708850432->140108709892448 140108709130872 bn3c_branch2a: BatchNormalization input: output: (None, 28, 28, 128) (None, 28, 28, 128) 140108709299704->140108709130872 140108709569536 activation_115: Activation input: output: (None, 28, 28, 128) (None, 28, 28, 128) 140108709130872->140108709569536 140108709479592 res3c_branch2b: Conv2D input: output: (None, 28, 28, 128) (None, 28, 28, 128) 140108709569536->140108709479592 140108709530592 bn3c_branch2b: BatchNormalization input: output: (None, 28, 28, 128) (None, 28, 28, 128) 140108709479592->140108709530592 140108709669016 activation_116: Activation input: output: (None, 28, 28, 128) (None, 28, 28, 128) 140108709530592->140108709669016 140108710144040 res3c_branch2c: Conv2D input: output: (None, 28, 28, 128) (None, 28, 28, 512) 140108709669016->140108710144040 140108710056904 bn3c_branch2c: BatchNormalization input: output: (None, 28, 28, 512) (None, 28, 28, 512) 140108710144040->140108710056904 140108710056904->140108709892448 140108710207160 activation_117: Activation input: output: (None, 28, 28, 512) (None, 28, 28, 512) 140108709892448->140108710207160 140108710251320 res3d_branch2a: Conv2D input: output: (None, 28, 28, 512) (None, 28, 28, 128) 140108710207160->140108710251320 140108711241208 add_39: Add input: output: [(None, 28, 28, 512), (None, 28, 28, 512)] (None, 28, 28, 512) 140108710207160->140108711241208 140108710517280 bn3d_branch2a: BatchNormalization input: output: (None, 28, 28, 128) (None, 28, 28, 128) 140108710251320->140108710517280 140108710429640 activation_118: Activation input: output: (None, 28, 28, 128) (None, 28, 28, 128) 140108710517280->140108710429640 140108710914256 res3d_branch2b: Conv2D input: output: (None, 28, 28, 128) (None, 28, 28, 128) 140108710429640->140108710914256 140108710823080 bn3d_branch2b: BatchNormalization 
input: output: (None, 28, 28, 128) (None, 28, 28, 128) 140108710914256->140108710823080 140108711182856 activation_119: Activation input: output: (None, 28, 28, 128) (None, 28, 28, 128) 140108710823080->140108711182856 140108711053128 res3d_branch2c: Conv2D input: output: (None, 28, 28, 128) (None, 28, 28, 512) 140108711182856->140108711053128 140108710965432 bn3d_branch2c: BatchNormalization input: output: (None, 28, 28, 512) (None, 28, 28, 512) 140108711053128->140108710965432 140108710965432->140108711241208 140108711724816 activation_120: Activation input: output: (None, 28, 28, 512) (None, 28, 28, 512) 140108711241208->140108711724816 140108711548744 res4a_branch2a: Conv2D input: output: (None, 28, 28, 512) (None, 14, 14, 256) 140108711724816->140108711548744 140108712714368 res4a_branch1: Conv2D input: output: (None, 28, 28, 512) (None, 14, 14, 1024) 140108711724816->140108712714368 140108711946056 bn4a_branch2a: BatchNormalization input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108711548744->140108711946056 140108711856224 activation_121: Activation input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108711946056->140108711856224 140108711814032 res4a_branch2b: Conv2D input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108711856224->140108711814032 140108711727456 bn4a_branch2b: BatchNormalization input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108711814032->140108711727456 140108712035272 activation_122: Activation input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108711727456->140108712035272 140108712440776 res4a_branch2c: Conv2D input: output: (None, 14, 14, 256) (None, 14, 14, 1024) 140108712035272->140108712440776 140108712487120 bn4a_branch2c: BatchNormalization input: output: (None, 14, 14, 1024) (None, 14, 14, 1024) 140108712440776->140108712487120 140108712932576 bn4a_branch1: BatchNormalization input: output: (None, 14, 14, 1024) (None, 14, 14, 1024) 140108712714368->140108712932576 140108712854304 add_40: 
Add input: output: [(None, 14, 14, 1024), (None, 14, 14, 1024)] (None, 14, 14, 1024) 140108712487120->140108712854304 140108712932576->140108712854304 140108713253128 activation_123: Activation input: output: (None, 14, 14, 1024) (None, 14, 14, 1024) 140108712854304->140108713253128 140108713161168 res4b_branch2a: Conv2D input: output: (None, 14, 14, 1024) (None, 14, 14, 256) 140108713253128->140108713161168 140108714235552 add_41: Add input: output: [(None, 14, 14, 1024), (None, 14, 14, 1024)] (None, 14, 14, 1024) 140108713253128->140108714235552 140108713160776 bn4b_branch2a: BatchNormalization input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108713161168->140108713160776 140108713391384 activation_124: Activation input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108713160776->140108713391384 140108713343352 res4b_branch2b: Conv2D input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108713391384->140108713343352 140108713784768 bn4b_branch2b: BatchNormalization input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108713343352->140108713784768 140108713608024 activation_125: Activation input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108713784768->140108713608024 140108714017456 res4b_branch2c: Conv2D input: output: (None, 14, 14, 256) (None, 14, 14, 1024) 140108713608024->140108714017456 140108713926048 bn4b_branch2c: BatchNormalization input: output: (None, 14, 14, 1024) (None, 14, 14, 1024) 140108714017456->140108713926048 140108713926048->140108714235552 140108714149928 activation_126: Activation input: output: (None, 14, 14, 1024) (None, 14, 14, 1024) 140108714235552->140108714149928 140108714515536 res4c_branch2a: Conv2D input: output: (None, 14, 14, 1024) (None, 14, 14, 256) 140108714149928->140108714515536 140108715143632 add_42: Add input: output: [(None, 14, 14, 1024), (None, 14, 14, 1024)] (None, 14, 14, 1024) 140108714149928->140108715143632 140108714379304 bn4c_branch2a: BatchNormalization input: output: (None, 14, 
14, 256) (None, 14, 14, 256) 140108714515536->140108714379304 140108714818080 activation_127: Activation input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108714379304->140108714818080 140108714779480 res4c_branch2b: Conv2D input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108714818080->140108714779480 140108714692112 bn4c_branch2b: BatchNormalization input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108714779480->140108714692112 140108715006160 activation_128: Activation input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108714692112->140108715006160 140108715393936 res4c_branch2c: Conv2D input: output: (None, 14, 14, 256) (None, 14, 14, 1024) 140108715006160->140108715393936 140108715320152 bn4c_branch2c: BatchNormalization input: output: (None, 14, 14, 1024) (None, 14, 14, 1024) 140108715393936->140108715320152 140108715320152->140108715143632 140108715540944 activation_129: Activation input: output: (None, 14, 14, 1024) (None, 14, 14, 1024) 140108715143632->140108715540944 140108715451616 res4d_branch2a: Conv2D input: output: (None, 14, 14, 1024) (None, 14, 14, 256) 140108715540944->140108715451616 140108716573024 add_43: Add input: output: [(None, 14, 14, 1024), (None, 14, 14, 1024)] (None, 14, 14, 1024) 140108715540944->140108716573024 140108715811392 bn4d_branch2a: BatchNormalization input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108715451616->140108715811392 140108715724136 activation_130: Activation input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108715811392->140108715724136 140108716156296 res4d_branch2b: Conv2D input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108715724136->140108716156296 140108715682168 bn4d_branch2b: BatchNormalization input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108716156296->140108715682168 140108716350768 activation_131: Activation input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108715682168->140108716350768 140108716309808 res4d_branch2c: Conv2D 
input: output: (None, 14, 14, 256) (None, 14, 14, 1024) 140108716350768->140108716309808 140108716212576 bn4d_branch2c: BatchNormalization input: output: (None, 14, 14, 1024) (None, 14, 14, 1024) 140108716309808->140108716212576 140108716212576->140108716573024 140108716965840 activation_132: Activation input: output: (None, 14, 14, 1024) (None, 14, 14, 1024) 140108716573024->140108716965840 140108716889256 res4e_branch2a: Conv2D input: output: (None, 14, 14, 1024) (None, 14, 14, 256) 140108716965840->140108716889256 140108717966896 add_44: Add input: output: [(None, 14, 14, 1024), (None, 14, 14, 1024)] (None, 14, 14, 1024) 140108716965840->140108717966896 140108717202960 bn4e_branch2a: BatchNormalization input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108716889256->140108717202960 140108717111448 activation_133: Activation input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108717202960->140108717111448 140108717068920 res4e_branch2b: Conv2D input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108717111448->140108717068920 140108716982680 bn4e_branch2b: BatchNormalization input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108717068920->140108716982680 140108717338408 activation_134: Activation input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108716982680->140108717338408 140108717724896 res4e_branch2c: Conv2D input: output: (None, 14, 14, 256) (None, 14, 14, 1024) 140108717338408->140108717724896 140108717641912 bn4e_branch2c: BatchNormalization input: output: (None, 14, 14, 1024) (None, 14, 14, 1024) 140108717724896->140108717641912 140108717641912->140108717966896 140108717881440 activation_135: Activation input: output: (None, 14, 14, 1024) (None, 14, 14, 1024) 140108717966896->140108717881440 140108718225000 res4f_branch2a: Conv2D input: output: (None, 14, 14, 1024) (None, 14, 14, 256) 140108717881440->140108718225000 140108727267456 add_45: Add input: output: [(None, 14, 14, 1024), (None, 14, 14, 1024)] (None, 14, 14, 
1024) 140108717881440->140108727267456 140108718106216 bn4f_branch2a: BatchNormalization input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108718225000->140108718106216 140108718024856 activation_136: Activation input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108718106216->140108718024856 140108718502520 res4f_branch2b: Conv2D input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108718024856->140108718502520 140108718408032 bn4f_branch2b: BatchNormalization input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108718502520->140108718408032 140108718727520 activation_137: Activation input: output: (None, 14, 14, 256) (None, 14, 14, 256) 140108718408032->140108718727520 140108718605256 res4f_branch2c: Conv2D input: output: (None, 14, 14, 256) (None, 14, 14, 1024) 140108718727520->140108718605256 140108718648400 bn4f_branch2c: BatchNormalization input: output: (None, 14, 14, 1024) (None, 14, 14, 1024) 140108718605256->140108718648400 140108718648400->140108727267456 140108727671608 activation_138: Activation input: output: (None, 14, 14, 1024) (None, 14, 14, 1024) 140108727267456->140108727671608 140108727580096 res5a_branch2a: Conv2D input: output: (None, 14, 14, 1024) (None, 7, 7, 512) 140108727671608->140108727580096 140108728697128 res5a_branch1: Conv2D input: output: (None, 14, 14, 1024) (None, 7, 7, 2048) 140108727671608->140108728697128 140108727935552 bn5a_branch2a: BatchNormalization input: output: (None, 7, 7, 512) (None, 7, 7, 512) 140108727580096->140108727935552 140108727841680 activation_139: Activation input: output: (None, 7, 7, 512) (None, 7, 7, 512) 140108727935552->140108727841680 140108727772272 res5a_branch2b: Conv2D input: output: (None, 7, 7, 512) (None, 7, 7, 512) 140108727841680->140108727772272 140108728203864 bn5a_branch2b: BatchNormalization input: output: (None, 7, 7, 512) (None, 7, 7, 512) 140108727772272->140108728203864 140108728472296 activation_140: Activation input: output: (None, 7, 7, 512) (None, 7, 7, 
512) 140108728203864->140108728472296 140108728423256 res5a_branch2c: Conv2D input: output: (None, 7, 7, 512) (None, 7, 7, 2048) 140108728472296->140108728423256 140108728341336 bn5a_branch2c: BatchNormalization input: output: (None, 7, 7, 2048) (None, 7, 7, 2048) 140108728423256->140108728341336 140108728956520 bn5a_branch1: BatchNormalization input: output: (None, 7, 7, 2048) (None, 7, 7, 2048) 140108728697128->140108728956520 140108728794024 add_46: Add input: output: [(None, 7, 7, 2048), (None, 7, 7, 2048)] (None, 7, 7, 2048) 140108728341336->140108728794024 140108728956520->140108728794024 140108729235552 activation_141: Activation input: output: (None, 7, 7, 2048) (None, 7, 7, 2048) 140108728794024->140108729235552 140108729105656 res5b_branch2a: Conv2D input: output: (None, 7, 7, 2048) (None, 7, 7, 512) 140108729235552->140108729105656 140108730225104 add_47: Add input: output: [(None, 7, 7, 2048), (None, 7, 7, 2048)] (None, 7, 7, 2048) 140108729235552->140108730225104 140108729464984 bn5b_branch2a: BatchNormalization input: output: (None, 7, 7, 512) (None, 7, 7, 512) 140108729105656->140108729464984 140108729420264 activation_142: Activation input: output: (None, 7, 7, 512) (None, 7, 7, 512) 140108729464984->140108729420264 140108729369320 res5b_branch2b: Conv2D input: output: (None, 7, 7, 512) (None, 7, 7, 512) 140108729420264->140108729369320 140108729765960 bn5b_branch2b: BatchNormalization input: output: (None, 7, 7, 512) (None, 7, 7, 512) 140108729369320->140108729765960 140108729566880 activation_143: Activation input: output: (None, 7, 7, 512) (None, 7, 7, 512) 140108729765960->140108729566880 140108729951512 res5b_branch2c: Conv2D input: output: (None, 7, 7, 512) (None, 7, 7, 2048) 140108729566880->140108729951512 140108729879296 bn5b_branch2c: BatchNormalization input: output: (None, 7, 7, 2048) (None, 7, 7, 2048) 140108729951512->140108729879296 140108729879296->140108730225104 140108730103120 activation_144: Activation input: output: (None, 7, 7, 
2048) (None, 7, 7, 2048) 140108730225104->140108730103120 140108730537688 res5c_branch2a: Conv2D input: output: (None, 7, 7, 2048) (None, 7, 7, 512) 140108730103120->140108730537688 140108731646416 add_48: Add input: output: [(None, 7, 7, 2048), (None, 7, 7, 2048)] (None, 7, 7, 2048) 140108730103120->140108731646416 140108730407624 bn5c_branch2a: BatchNormalization input: output: (None, 7, 7, 512) (None, 7, 7, 512) 140108730537688->140108730407624 140108730811392 activation_145: Activation input: output: (None, 7, 7, 512) (None, 7, 7, 512) 140108730407624->140108730811392 140108730683176 res5c_branch2b: Conv2D input: output: (None, 7, 7, 512) (None, 7, 7, 512) 140108730811392->140108730683176 140108730772672 bn5c_branch2b: BatchNormalization input: output: (None, 7, 7, 512) (None, 7, 7, 512) 140108730683176->140108730772672 140108730903016 activation_146: Activation input: output: (None, 7, 7, 512) (None, 7, 7, 512) 140108730772672->140108730903016 140108731392128 res5c_branch2c: Conv2D input: output: (None, 7, 7, 512) (None, 7, 7, 2048) 140108730903016->140108731392128 140108731303136 bn5c_branch2c: BatchNormalization input: output: (None, 7, 7, 2048) (None, 7, 7, 2048) 140108731392128->140108731303136 140108731303136->140108731646416 140108731533352 activation_147: Activation input: output: (None, 7, 7, 2048) (None, 7, 7, 2048) 140108731646416->140108731533352 140108732690384 avg_pool: AveragePooling2D input: output: (None, 7, 7, 2048) (None, 1, 1, 2048) 140108731533352->140108732690384 140108731439200 global_average_pooling2d_3: GlobalAveragePooling2D input: output: (None, 1, 1, 2048) (None, 2048) 140108732690384->140108731439200 140108731805480 dropout_3: Dropout input: output: (None, 2048) (None, 2048) 140108731439200->140108731805480 140108731805312 dense_3: Dense input: output: (None, 2048) (None, 1) 140108731805480->140108731805312
In [13]:
fit_model_fine_whole(final_model, "ResNet50", (224, 224),
                     n_trainable_layer=0,
                     preprocess_func=resnet50.preprocess_input,
                     epochs_max=5,
                     auto_save=True,
                     augmented=False)
Found 20000 images belonging to 2 classes.
Found 5000 images belonging to 2 classes.
Fitting Model...
Epoch 1/5
157/157 [==============================] - 194s 1s/step - loss: 0.1893 - acc: 0.9237 - val_loss: 0.0583 - val_acc: 0.9798
Epoch 2/5
157/157 [==============================] - 193s 1s/step - loss: 0.0848 - acc: 0.9690 - val_loss: 0.0476 - val_acc: 0.9832
Epoch 3/5
157/157 [==============================] - 193s 1s/step - loss: 0.0711 - acc: 0.9731 - val_loss: 0.0589 - val_acc: 0.9782
Epoch 4/5
157/157 [==============================] - 194s 1s/step - loss: 0.0665 - acc: 0.9747 - val_loss: 0.0445 - val_acc: 0.9854
Epoch 5/5
157/157 [==============================] - 194s 1s/step - loss: 0.0672 - acc: 0.9758 - val_loss: 0.0514 - val_acc: 0.9830
Out[13]:
(<keras.engine.training.Model at 0x7f6d9b2f0828>,
 <keras.callbacks.History at 0x7f6d89308198>)
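The `157/157` per epoch follows directly from the 20,000 training images and the `BATCH_SIZE = 128` defined at the top of the notebook: the generator needs ⌈20000 / 128⌉ batches to see every image once. A quick sketch (`steps_per_epoch` here is an illustrative helper, not from `helper.py`):

```python
import math

def steps_per_epoch(n_samples, batch_size):
    # Round up so the final partial batch is still drawn each epoch.
    return math.ceil(n_samples / batch_size)

print(steps_per_epoch(20000, 128))  # → 157
print(steps_per_epoch(12500, 128))  # → 98, matching the test-set progress bars
```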

Stage 2: fine-tune the last block and the output layer

In [31]:
# Unfreeze only the last n_trainable_fromlastlayer layers of the model;
# everything earlier stays frozen.
n_trainable_fromlastlayer = 8

untrainable_num = len(final_model.layers) - n_trainable_fromlastlayer
for i, layer in enumerate(final_model.layers):
    layer.trainable = i >= untrainable_num

# A small learning rate for fine-tuning, so the pretrained weights are
# nudged rather than overwritten.
sgd = optimizers.SGD(lr=0.0001, decay=0, momentum=0.9)
final_model.compile(optimizer=sgd,
                    loss='binary_crossentropy',
                    metrics=['accuracy'])
In [34]:
history = fit_model_fine_whole(final_model, "ResNet50", (224, 224),
                               n_trainable_layer=5,
                               preprocess_func=resnet50.preprocess_input,
                               epochs_max=50,
                               auto_save=True,
                               augmented=False)
Found 20000 images belonging to 2 classes.
Found 5000 images belonging to 2 classes.
Fitting Model...
Epoch 1/50
157/157 [==============================] - 198s 1s/step - loss: 0.0593 - acc: 0.9790 - val_loss: 0.0541 - val_acc: 0.9808
Epoch 2/50
157/157 [==============================] - 196s 1s/step - loss: 0.0594 - acc: 0.9778 - val_loss: 0.0553 - val_acc: 0.9802
Epoch 3/50
157/157 [==============================] - 197s 1s/step - loss: 0.0590 - acc: 0.9779 - val_loss: 0.0535 - val_acc: 0.9810
Epoch 4/50
157/157 [==============================] - 197s 1s/step - loss: 0.0589 - acc: 0.9777 - val_loss: 0.0539 - val_acc: 0.9810
Epoch 5/50
157/157 [==============================] - 196s 1s/step - loss: 0.0573 - acc: 0.9788 - val_loss: 0.0540 - val_acc: 0.9812
Epoch 6/50
157/157 [==============================] - 196s 1s/step - loss: 0.0559 - acc: 0.9783 - val_loss: 0.0540 - val_acc: 0.9810
Epoch 7/50
157/157 [==============================] - 197s 1s/step - loss: 0.0557 - acc: 0.9793 - val_loss: 0.0548 - val_acc: 0.9802
Epoch 8/50
157/157 [==============================] - 197s 1s/step - loss: 0.0579 - acc: 0.9777 - val_loss: 0.0553 - val_acc: 0.9798
Epoch 9/50
157/157 [==============================] - 197s 1s/step - loss: 0.0578 - acc: 0.9779 - val_loss: 0.0546 - val_acc: 0.9806
Epoch 10/50
157/157 [==============================] - 197s 1s/step - loss: 0.0585 - acc: 0.9774 - val_loss: 0.0538 - val_acc: 0.9812
Epoch 11/50
157/157 [==============================] - 197s 1s/step - loss: 0.0567 - acc: 0.9791 - val_loss: 0.0546 - val_acc: 0.9804
Epoch 12/50
157/157 [==============================] - 197s 1s/step - loss: 0.0550 - acc: 0.9801 - val_loss: 0.0542 - val_acc: 0.9804
Epoch 13/50
157/157 [==============================] - 198s 1s/step - loss: 0.0554 - acc: 0.9791 - val_loss: 0.0542 - val_acc: 0.9804
Epoch 00013: early stopping
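Training halted at epoch 13 by early stopping: the validation loss stayed flat around 0.054, so further epochs would only risk overfitting. The mechanism can be sketched with a minimal monitor (a stand-in for Keras's `EarlyStopping` callback, assuming its `patience`/`min_delta` semantics):

```python
class EarlyStopper:
    """Stop when the monitored loss has not improved for `patience` epochs."""
    def __init__(self, patience=5, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.wait = 0

    def should_stop(self, val_loss):
        if val_loss < self.best - self.min_delta:
            self.best = val_loss  # improvement: reset the counter
            self.wait = 0
        else:
            self.wait += 1        # no improvement this epoch
        return self.wait >= self.patience

stopper = EarlyStopper(patience=3)
losses = [0.054, 0.055, 0.054, 0.054, 0.054]
stops = [stopper.should_stop(l) for l in losses]
print(stops)  # → [False, False, False, True, True]
```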

Predict on the test set and export a CSV

In [15]:
test_generator_to_csv(final_model, (224,224), csv_filename=get_filename("ResNet50", "fine", "trainable_layers_training",ext="csv"))
Found 12500 images belonging to 1 classes.
98/98 [==============================] - 110s 1s/step
Found 12500 images belonging to 1 classes.
   id     label
0   1  0.992661
1   2  0.995000
2   3  0.995000
3   4  0.995000
4   5  0.005000
5   6  0.005000
6   7  0.005000
7   8  0.005000
8   9  0.005000
9  10  0.005000
CSV file saved to csv_output/fine_ResNet50_trainable_layers_training.csv
/home/ubuntu/cats_dogs/helper.py:484: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
  df.set_value(index - 1, 'label', predict[i])
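The 0.995 and 0.005 values in the predictions above are the result of clipping: the competition is scored by log loss, which is unbounded when a wrong prediction is exactly 0 or 1, so probabilities are pushed away from the extremes before submission. A minimal sketch (`clip_prediction` is an illustrative name, not from `helper.py`):

```python
def clip_prediction(p, eps=0.005):
    # Keep predicted probabilities inside [eps, 1 - eps] so a single
    # confidently wrong answer contributes -log(eps) instead of infinity.
    return min(max(p, eps), 1 - eps)

print([clip_prediction(p) for p in (0.0, 0.3, 1.0)])  # clipped into [0.005, 0.995]
```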

Standalone fine-tuning in a single run (this step is independent of the steps above)

In [5]:
model, history = run_finetuning_model(ResNet50, (224, 224),
                                      n_trainable_layer=11,
                                      preprocess_func=resnet50.preprocess_input,
                                      epochs_max=20,
                                      augmented=True,
                                      apply_pretrain_weights=True)
Creating Model...
model created
ResNet50 pretrained output weights applied to new model:[array([[ 0.02405468],
       [ 0.04063614],
       [-0.007013  ],
       ...,
       [ 0.02188864],
       [-0.09470217],
       [ 0.12757942]], dtype=float32), array([0.05355521], dtype=float32)]
Found 20000 images belonging to 2 classes.
Found 5000 images belonging to 2 classes.
Fitting Model...
Epoch 1/20
157/157 [==============================] - 278s 2s/step - loss: 0.0894 - acc: 0.9658 - val_loss: 0.0424 - val_acc: 0.9850
Epoch 2/20
157/157 [==============================] - 241s 2s/step - loss: 0.0784 - acc: 0.9695 - val_loss: 0.0483 - val_acc: 0.9836
Epoch 3/20
157/157 [==============================] - 243s 2s/step - loss: 0.0706 - acc: 0.9730 - val_loss: 0.0477 - val_acc: 0.9836
Epoch 4/20
157/157 [==============================] - 244s 2s/step - loss: 0.0654 - acc: 0.9744 - val_loss: 0.0346 - val_acc: 0.9862
Epoch 5/20
157/157 [==============================] - 242s 2s/step - loss: 0.0639 - acc: 0.9749 - val_loss: 0.0462 - val_acc: 0.9848
Epoch 6/20
157/157 [==============================] - 243s 2s/step - loss: 0.0620 - acc: 0.9757 - val_loss: 0.0362 - val_acc: 0.9872
Epoch 7/20
157/157 [==============================] - 242s 2s/step - loss: 0.0587 - acc: 0.9789 - val_loss: 0.0380 - val_acc: 0.9860
Epoch 8/20
157/157 [==============================] - 245s 2s/step - loss: 0.0581 - acc: 0.9765 - val_loss: 0.0366 - val_acc: 0.9868
Epoch 00008: early stopping
Testing Model...
Found 12500 images belonging to 1 classes.
98/98 [==============================] - 110s 1s/step
Found 12500 images belonging to 1 classes.
   id  label
0   1  0.995
1   2  0.995
2   3  0.995
3   4  0.995
4   5  0.005
5   6  0.005
6   7  0.005
7   8  0.005
8   9  0.005
9  10  0.005
CSV file saved to csv_output/fine_ResNet50_loadweight_lastopen11_augmented.csv
/home/ubuntu/cats_dogs/helper.py:484: FutureWarning: set_value is deprecated and will be removed in a future release. Please use .at[] or .iat[] accessors instead
  df.set_value(index - 1, 'label', predict[i])
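The clipped 0.995/0.005 values in this CSV trade a tiny penalty on confident correct answers for a bounded penalty on confident wrong ones. A minimal sketch of the log-loss metric that motivates this:

```python
import math

def log_loss(y_true, y_pred):
    # Mean binary cross-entropy over the submission, which is the
    # metric Kaggle scores this competition with.
    n = len(y_true)
    return -sum(y * math.log(p) + (1 - y) * math.log(1 - p)
                for y, p in zip(y_true, y_pred)) / n

# A confident correct prediction (0.995) costs -log(0.995) ≈ 0.005 per
# image; a confident wrong one, clipped to 0.005, costs ≈ 5.3 rather
# than the unbounded penalty an exact 0 or 1 would incur.
```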

Validation: display the images with the highest prediction loss

In [3]:
model = create_IncXceRes_model_merge()
model created
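Ranking validation images by their individual loss needs only a sort over per-image losses; a minimal selection sketch (`top_loss_indices` is an illustrative helper, not from `helper.py`):

```python
def top_loss_indices(losses, k=5):
    # Indices of the k largest per-image losses, worst first.
    return sorted(range(len(losses)), key=lambda i: losses[i], reverse=True)[:k]

print(top_loss_indices([0.1, 2.3, 0.05, 1.7], k=2))  # → [1, 3]
```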
In [35]:
SVG(model_to_dot(model, show_shapes=True).create(prog='dot', format='svg'))
Out[35]:
[Model graph omitted: create_IncXceRes_model_merge — a two-input merged model: a ResNet50 branch on a (None, 224, 224, 3) input and Inception/Xception-style branches on a (None, 299, 299, 3) input, each fed through its own preprocessing Lambda layer before the pretrained convolutional stacks]
Conv2D input: output: (None, 35, 35, 48) (None, 35, 35, 64) 6592279552->6592141800 5653515624 conv2d_403: Conv2D input: output: (None, 35, 35, 96) (None, 35, 35, 96) 5653669480->5653515624 6721716064 conv2d_404: Conv2D input: output: (None, 35, 35, 192) (None, 35, 35, 32) 6749972184->6721716064 6882688024->6882934400 6758989776 batch_normalization_398: BatchNormalization input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6761012192->6758989776 6315932248 batch_normalization_400: BatchNormalization input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6592141800->6315932248 6750187304 batch_normalization_403: BatchNormalization input: output: (None, 35, 35, 96) (None, 35, 35, 96) 5653515624->6750187304 6706788000 batch_normalization_404: BatchNormalization input: output: (None, 35, 35, 32) (None, 35, 35, 32) 6721716064->6706788000 6883218712 activation_676: Activation input: output: (None, 55, 55, 256) (None, 55, 55, 256) 6882934400->6883218712 6759075008 activation_578: Activation input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6758989776->6759075008 6203264248 activation_580: Activation input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6315932248->6203264248 6750120256 activation_583: Activation input: output: (None, 35, 35, 96) (None, 35, 35, 96) 6750187304->6750120256 6673468496 activation_584: Activation input: output: (None, 35, 35, 32) (None, 35, 35, 32) 6706788000->6673468496 6883286152 res3a_branch2a: Conv2D input: output: (None, 55, 55, 256) (None, 28, 28, 128) 6883218712->6883286152 6887445728 res3a_branch1: Conv2D input: output: (None, 55, 55, 256) (None, 28, 28, 512) 6883218712->6887445728 6673468832 mixed0: Concatenate input: output: [(None, 35, 35, 64), (None, 35, 35, 64), (None, 35, 35, 96), (None, 35, 35, 32)] (None, 35, 35, 256) 6759075008->6673468832 6203264248->6673468832 6750120256->6673468832 6673468496->6673468832 6883763816 bn3a_branch2a: BatchNormalization input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6883286152->6883763816 
5653417768 conv2d_408: Conv2D input: output: (None, 35, 35, 256) (None, 35, 35, 64) 6673468832->5653417768 6440513488 conv2d_406: Conv2D input: output: (None, 35, 35, 256) (None, 35, 35, 48) 6673468832->6440513488 6724389576 average_pooling2d_38: AveragePooling2D input: output: (None, 35, 35, 256) (None, 35, 35, 256) 6673468832->6724389576 6673468552 conv2d_405: Conv2D input: output: (None, 35, 35, 256) (None, 35, 35, 64) 6673468832->6673468552 6883904480 activation_677: Activation input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6883763816->6883904480 6545365760 batch_normalization_408: BatchNormalization input: output: (None, 35, 35, 64) (None, 35, 35, 64) 5653417768->6545365760 6883958288 res3a_branch2b: Conv2D input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6883904480->6883958288 6749625760 activation_588: Activation input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6545365760->6749625760 6884236144 bn3a_branch2b: BatchNormalization input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6883958288->6884236144 6597311344 conv2d_409: Conv2D input: output: (None, 35, 35, 64) (None, 35, 35, 96) 6749625760->6597311344 6886620688 activation_678: Activation input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6884236144->6886620688 6591426744 batch_normalization_406: BatchNormalization input: output: (None, 35, 35, 48) (None, 35, 35, 48) 6440513488->6591426744 6725352752 batch_normalization_409: BatchNormalization input: output: (None, 35, 35, 96) (None, 35, 35, 96) 6597311344->6725352752 6886832784 res3a_branch2c: Conv2D input: output: (None, 28, 28, 128) (None, 28, 28, 512) 6886620688->6886832784 6581439232 activation_586: Activation input: output: (None, 35, 35, 48) (None, 35, 35, 48) 6591426744->6581439232 6749308744 activation_589: Activation input: output: (None, 35, 35, 96) (None, 35, 35, 96) 6725352752->6749308744 6887107272 bn3a_branch2c: BatchNormalization input: output: (None, 28, 28, 512) (None, 28, 28, 512) 6886832784->6887107272 
6887635656 bn3a_branch1: BatchNormalization input: output: (None, 28, 28, 512) (None, 28, 28, 512) 6887445728->6887635656 6568170888 conv2d_407: Conv2D input: output: (None, 35, 35, 48) (None, 35, 35, 64) 6581439232->6568170888 6725039328 conv2d_410: Conv2D input: output: (None, 35, 35, 96) (None, 35, 35, 96) 6749308744->6725039328 6723759576 conv2d_411: Conv2D input: output: (None, 35, 35, 256) (None, 35, 35, 64) 6724389576->6723759576 6888316152 add_128: Add input: output: [(None, 28, 28, 512), (None, 28, 28, 512)] (None, 28, 28, 512) 6887107272->6888316152 6887635656->6888316152 6440695064 batch_normalization_405: BatchNormalization input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6673468552->6440695064 6441210712 batch_normalization_407: BatchNormalization input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6568170888->6441210712 6724513184 batch_normalization_410: BatchNormalization input: output: (None, 35, 35, 96) (None, 35, 35, 96) 6725039328->6724513184 6724923288 batch_normalization_411: BatchNormalization input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6723759576->6724923288 6609682616 block1_conv1_bn: BatchNormalization input: output: (None, 149, 149, 32) (None, 149, 149, 32) 6609682560->6609682616 6888386288 activation_679: Activation input: output: (None, 28, 28, 512) (None, 28, 28, 512) 6888316152->6888386288 6440512200 activation_585: Activation input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6440695064->6440512200 5653461536 activation_587: Activation input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6441210712->5653461536 6724341944 activation_590: Activation input: output: (None, 35, 35, 96) (None, 35, 35, 96) 6724513184->6724341944 6227881712 activation_591: Activation input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6724923288->6227881712 6608215568 block1_conv1_act: Activation input: output: (None, 149, 149, 32) (None, 149, 149, 32) 6609682616->6608215568 6888507040 res3b_branch2a: Conv2D input: output: (None, 28, 28, 
512) (None, 28, 28, 128) 6888386288->6888507040 6890446464 add_129: Add input: output: [(None, 28, 28, 512), (None, 28, 28, 512)] (None, 28, 28, 512) 6888386288->6890446464 6227878856 mixed1: Concatenate input: output: [(None, 35, 35, 64), (None, 35, 35, 64), (None, 35, 35, 96), (None, 35, 35, 64)] (None, 35, 35, 288) 6440512200->6227878856 5653461536->6227878856 6724341944->6227878856 6227881712->6227878856 6608215960 block1_conv2: Conv2D input: output: (None, 149, 149, 32) (None, 147, 147, 64) 6608215568->6608215960 6888902384 bn3b_branch2a: BatchNormalization input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6888507040->6888902384 6285760048 conv2d_415: Conv2D input: output: (None, 35, 35, 288) (None, 35, 35, 64) 6227878856->6285760048 5126846168 conv2d_413: Conv2D input: output: (None, 35, 35, 288) (None, 35, 35, 48) 6227878856->5126846168 6287914376 average_pooling2d_39: AveragePooling2D input: output: (None, 35, 35, 288) (None, 35, 35, 288) 6227878856->6287914376 6227937320 conv2d_412: Conv2D input: output: (None, 35, 35, 288) (None, 35, 35, 64) 6227878856->6227937320 6608411616 block1_conv2_bn: BatchNormalization input: output: (None, 147, 147, 64) (None, 147, 147, 64) 6608215960->6608411616 6889069200 activation_680: Activation input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6888902384->6889069200 6286027576 batch_normalization_415: BatchNormalization input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6285760048->6286027576 6609301120 block1_conv2_act: Activation input: output: (None, 147, 147, 64) (None, 147, 147, 64) 6608411616->6609301120 6889119080 res3b_branch2b: Conv2D input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6889069200->6889119080 6286510792 activation_595: Activation input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6286027576->6286510792 6767369856 block2_sepconv1: SeparableConv2D input: output: (None, 147, 147, 64) (None, 147, 147, 128) 6609301120->6767369856 6609489872 conv2d_487: Conv2D input: output: (None, 147, 
147, 64) (None, 74, 74, 128) 6609301120->6609489872 6889389640 bn3b_branch2b: BatchNormalization input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6889119080->6889389640 6286442448 conv2d_416: Conv2D input: output: (None, 35, 35, 64) (None, 35, 35, 96) 6286510792->6286442448 6770758936 block2_sepconv1_bn: BatchNormalization input: output: (None, 147, 147, 128) (None, 147, 147, 128) 6767369856->6770758936 6889868368 activation_681: Activation input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6889389640->6889868368 6144388568 batch_normalization_413: BatchNormalization input: output: (None, 35, 35, 48) (None, 35, 35, 48) 5126846168->6144388568 6286777256 batch_normalization_416: BatchNormalization input: output: (None, 35, 35, 96) (None, 35, 35, 96) 6286442448->6286777256 6142619544 block2_sepconv2_act: Activation input: output: (None, 147, 147, 128) (None, 147, 147, 128) 6770758936->6142619544 6889930368 res3b_branch2c: Conv2D input: output: (None, 28, 28, 128) (None, 28, 28, 512) 6889868368->6889930368 6205039280 activation_593: Activation input: output: (None, 35, 35, 48) (None, 35, 35, 48) 6144388568->6205039280 6287019592 activation_596: Activation input: output: (None, 35, 35, 96) (None, 35, 35, 96) 6286777256->6287019592 6142954072 block2_sepconv2: SeparableConv2D input: output: (None, 147, 147, 128) (None, 147, 147, 128) 6142619544->6142954072 6890191840 bn3b_branch2c: BatchNormalization input: output: (None, 28, 28, 512) (None, 28, 28, 512) 6889930368->6890191840 6227361016 conv2d_414: Conv2D input: output: (None, 35, 35, 48) (None, 35, 35, 64) 6205039280->6227361016 6287178776 conv2d_417: Conv2D input: output: (None, 35, 35, 96) (None, 35, 35, 96) 6287019592->6287178776 6288025640 conv2d_418: Conv2D input: output: (None, 35, 35, 288) (None, 35, 35, 64) 6287914376->6288025640 6282921072 block2_sepconv2_bn: BatchNormalization input: output: (None, 147, 147, 128) (None, 147, 147, 128) 6142954072->6282921072 6890191840->6890446464 6285080000 
batch_normalization_412: BatchNormalization input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6227937320->6285080000 6227619736 batch_normalization_414: BatchNormalization input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6227361016->6227619736 6287441368 batch_normalization_417: BatchNormalization input: output: (None, 35, 35, 96) (None, 35, 35, 96) 6287178776->6287441368 6288096896 batch_normalization_418: BatchNormalization input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6288025640->6288096896 6282993000 block2_pool: MaxPooling2D input: output: (None, 147, 147, 128) (None, 74, 74, 128) 6282921072->6282993000 6747605984 batch_normalization_487: BatchNormalization input: output: (None, 74, 74, 128) (None, 74, 74, 128) 6609489872->6747605984 6893683936 activation_682: Activation input: output: (None, 28, 28, 512) (None, 28, 28, 512) 6890446464->6893683936 5126787024 activation_592: Activation input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6285080000->5126787024 6285609896 activation_594: Activation input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6227619736->6285609896 6287767536 activation_597: Activation input: output: (None, 35, 35, 96) (None, 35, 35, 96) 6287441368->6287767536 6293027920 activation_598: Activation input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6288096896->6293027920 6581937992 add_113: Add input: output: [(None, 74, 74, 128), (None, 74, 74, 128)] (None, 74, 74, 128) 6282993000->6581937992 6747605984->6581937992 6893759568 res3c_branch2a: Conv2D input: output: (None, 28, 28, 512) (None, 28, 28, 128) 6893683936->6893759568 6895764648 add_130: Add input: output: [(None, 28, 28, 512), (None, 28, 28, 512)] (None, 28, 28, 512) 6893683936->6895764648 6293027640 mixed2: Concatenate input: output: [(None, 35, 35, 64), (None, 35, 35, 64), (None, 35, 35, 96), (None, 35, 35, 64)] (None, 35, 35, 288) 5126787024->6293027640 6285609896->6293027640 6287767536->6293027640 6293027920->6293027640 6668672864 block3_sepconv1_act: 
Activation input: output: (None, 74, 74, 128) (None, 74, 74, 128) 6581937992->6668672864 6582013624 conv2d_488: Conv2D input: output: (None, 74, 74, 128) (None, 37, 37, 256) 6581937992->6582013624 6894233136 bn3c_branch2a: BatchNormalization input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6893759568->6894233136 6293743824 conv2d_420: Conv2D input: output: (None, 35, 35, 288) (None, 35, 35, 64) 6293027640->6293743824 6293081952 conv2d_419: Conv2D input: output: (None, 35, 35, 288) (None, 17, 17, 384) 6293027640->6293081952 6295893720 max_pooling2d_23: MaxPooling2D input: output: (None, 35, 35, 288) (None, 17, 17, 288) 6293027640->6295893720 6668721064 block3_sepconv1: SeparableConv2D input: output: (None, 74, 74, 128) (None, 74, 74, 256) 6668672864->6668721064 6894369704 activation_683: Activation input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6894233136->6894369704 6294011352 batch_normalization_420: BatchNormalization input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6293743824->6294011352 6669687048 block3_sepconv1_bn: BatchNormalization input: output: (None, 74, 74, 256) (None, 74, 74, 256) 6668721064->6669687048 6894427608 res3c_branch2b: Conv2D input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6894369704->6894427608 6294333872 activation_600: Activation input: output: (None, 35, 35, 64) (None, 35, 35, 64) 6294011352->6294333872 6668983880 block3_sepconv2_act: Activation input: output: (None, 74, 74, 256) (None, 74, 74, 256) 6669687048->6668983880 6894705464 bn3c_branch2b: BatchNormalization input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6894427608->6894705464 6294488960 conv2d_421: Conv2D input: output: (None, 35, 35, 64) (None, 35, 35, 96) 6294333872->6294488960 6669820088 block3_sepconv2: SeparableConv2D input: output: (None, 74, 74, 256) (None, 74, 74, 256) 6668983880->6669820088 6894947800 activation_684: Activation input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6894705464->6894947800 6294752896 
batch_normalization_421: BatchNormalization input: output: (None, 35, 35, 96) (None, 35, 35, 96) 6294488960->6294752896 6677129928 block3_sepconv2_bn: BatchNormalization input: output: (None, 74, 74, 256) (None, 74, 74, 256) 6669820088->6677129928 6895159896 res3c_branch2c: Conv2D input: output: (None, 28, 28, 128) (None, 28, 28, 512) 6894947800->6895159896 6295085912 activation_601: Activation input: output: (None, 35, 35, 96) (None, 35, 35, 96) 6294752896->6295085912 6676798152 block3_pool: MaxPooling2D input: output: (None, 74, 74, 256) (None, 37, 37, 256) 6677129928->6676798152 6582081128 batch_normalization_488: BatchNormalization input: output: (None, 37, 37, 256) (None, 37, 37, 256) 6582013624->6582081128 6895420248 bn3c_branch2c: BatchNormalization input: output: (None, 28, 28, 512) (None, 28, 28, 512) 6895159896->6895420248 6295162272 conv2d_422: Conv2D input: output: (None, 35, 35, 96) (None, 17, 17, 96) 6295085912->6295162272 6677437856 add_114: Add input: output: [(None, 37, 37, 256), (None, 37, 37, 256)] (None, 37, 37, 256) 6676798152->6677437856 6582081128->6677437856 6895420248->6895764648 6293689288 batch_normalization_419: BatchNormalization input: output: (None, 17, 17, 384) (None, 17, 17, 384) 6293081952->6293689288 6295493040 batch_normalization_422: BatchNormalization input: output: (None, 17, 17, 96) (None, 17, 17, 96) 6295162272->6295493040 6707244392 block4_sepconv1_act: Activation input: output: (None, 37, 37, 256) (None, 37, 37, 256) 6677437856->6707244392 6706911888 conv2d_489: Conv2D input: output: (None, 37, 37, 256) (None, 19, 19, 728) 6677437856->6706911888 6895962768 activation_685: Activation input: output: (None, 28, 28, 512) (None, 28, 28, 512) 6895764648->6895962768 6293745616 activation_599: Activation input: output: (None, 17, 17, 384) (None, 17, 17, 384) 6293689288->6293745616 6295739472 activation_602: Activation input: output: (None, 17, 17, 96) (None, 17, 17, 96) 6295493040->6295739472 6707242488 block4_sepconv1: 
SeparableConv2D input: output: (None, 37, 37, 256) (None, 37, 37, 728) 6707244392->6707242488 6896020280 res3d_branch2a: Conv2D input: output: (None, 28, 28, 512) (None, 28, 28, 128) 6895962768->6896020280 6900535248 add_131: Add input: output: [(None, 28, 28, 512), (None, 28, 28, 512)] (None, 28, 28, 512) 6895962768->6900535248 6296312296 mixed3: Concatenate input: output: [(None, 17, 17, 384), (None, 17, 17, 96), (None, 17, 17, 288)] (None, 17, 17, 768) 6293745616->6296312296 6295739472->6296312296 6295893720->6296312296 6707814240 block4_sepconv1_bn: BatchNormalization input: output: (None, 37, 37, 728) (None, 37, 37, 728) 6707242488->6707814240 6896491264 bn3d_branch2a: BatchNormalization input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6896020280->6896491264 6302336392 conv2d_427: Conv2D input: output: (None, 17, 17, 768) (None, 17, 17, 128) 6296312296->6302336392 6296630104 conv2d_424: Conv2D input: output: (None, 17, 17, 768) (None, 17, 17, 128) 6296312296->6296630104 6305888016 average_pooling2d_40: AveragePooling2D input: output: (None, 17, 17, 768) (None, 17, 17, 768) 6296312296->6305888016 6296157208 conv2d_423: Conv2D input: output: (None, 17, 17, 768) (None, 17, 17, 192) 6296312296->6296157208 6707486560 block4_sepconv2_act: Activation input: output: (None, 37, 37, 728) (None, 37, 37, 728) 6707814240->6707486560 6896641584 activation_686: Activation input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6896491264->6896641584 6302754184 batch_normalization_427: BatchNormalization input: output: (None, 17, 17, 128) (None, 17, 17, 128) 6302336392->6302754184 6707955472 block4_sepconv2: SeparableConv2D input: output: (None, 37, 37, 728) (None, 37, 37, 728) 6707486560->6707955472 6896700608 res3d_branch2b: Conv2D input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6896641584->6896700608 6302953880 activation_607: Activation input: output: (None, 17, 17, 128) (None, 17, 17, 128) 6302754184->6302953880 6719938912 block4_sepconv2_bn: 
BatchNormalization input: output: (None, 37, 37, 728) (None, 37, 37, 728) 6707955472->6719938912 6896971224 bn3d_branch2b: BatchNormalization input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6896700608->6896971224 6303018008 conv2d_428: Conv2D input: output: (None, 17, 17, 128) (None, 17, 17, 128) 6302953880->6303018008 6708272880 block4_pool: MaxPooling2D input: output: (None, 37, 37, 728) (None, 19, 19, 728) 6719938912->6708272880 6682002768 batch_normalization_489: BatchNormalization input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6706911888->6682002768 6897300480 activation_687: Activation input: output: (None, 28, 28, 128) (None, 28, 28, 128) 6896971224->6897300480 6303268424 batch_normalization_428: BatchNormalization input: output: (None, 17, 17, 128) (None, 17, 17, 128) 6303018008->6303268424 6720179504 add_115: Add input: output: [(None, 19, 19, 728), (None, 19, 19, 728)] (None, 19, 19, 728) 6708272880->6720179504 6682002768->6720179504 6897442600 res3d_branch2c: Conv2D input: output: (None, 28, 28, 128) (None, 28, 28, 512) 6897300480->6897442600 6303595320 activation_608: Activation input: output: (None, 17, 17, 128) (None, 17, 17, 128) 6303268424->6303595320 6720309848 block5_sepconv1_act: Activation input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6720179504->6720309848 6722735072 add_116: Add input: output: [(None, 19, 19, 728), (None, 19, 19, 728)] (None, 19, 19, 728) 6720179504->6722735072 6900055512 bn3d_branch2c: BatchNormalization input: output: (None, 28, 28, 512) (None, 28, 28, 512) 6897442600->6900055512 6303737280 conv2d_429: Conv2D input: output: (None, 17, 17, 128) (None, 17, 17, 128) 6303595320->6303737280 6720395808 block5_sepconv1: SeparableConv2D input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6720309848->6720395808 6900055512->6900535248 6296890168 batch_normalization_424: BatchNormalization input: output: (None, 17, 17, 128) (None, 17, 17, 128) 6296630104->6296890168 6304000712 batch_normalization_429: 
BatchNormalization input: output: (None, 17, 17, 128) (None, 17, 17, 128) 6303737280->6304000712 6721191328 block5_sepconv1_bn: BatchNormalization input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6720395808->6721191328 6900600224 activation_688: Activation input: output: (None, 28, 28, 512) (None, 28, 28, 512) 6900535248->6900600224 6300928712 activation_604: Activation input: output: (None, 17, 17, 128) (None, 17, 17, 128) 6296890168->6300928712 6304478824 activation_609: Activation input: output: (None, 17, 17, 128) (None, 17, 17, 128) 6304000712->6304478824 6721191888 block5_sepconv2_act: Activation input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6721191328->6721191888 6900729840 res4a_branch2a: Conv2D input: output: (None, 28, 28, 512) (None, 14, 14, 256) 6900600224->6900729840 6941835616 res4a_branch1: Conv2D input: output: (None, 28, 28, 512) (None, 14, 14, 1024) 6900600224->6941835616 6300860368 conv2d_425: Conv2D input: output: (None, 17, 17, 128) (None, 17, 17, 128) 6300928712->6300860368 6304411544 conv2d_430: Conv2D input: output: (None, 17, 17, 128) (None, 17, 17, 128) 6304478824->6304411544 6721299736 block5_sepconv2: SeparableConv2D input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6721191888->6721299736 6901223888 bn4a_branch2a: BatchNormalization input: output: (None, 14, 14, 256) (None, 14, 14, 256) 6900729840->6901223888 6301207464 batch_normalization_425: BatchNormalization input: output: (None, 17, 17, 128) (None, 17, 17, 128) 6300860368->6301207464 6304742368 batch_normalization_430: BatchNormalization input: output: (None, 17, 17, 128) (None, 17, 17, 128) 6304411544->6304742368 6722284064 block5_sepconv2_bn: BatchNormalization input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6721299736->6722284064 6901279712 activation_689: Activation input: output: (None, 14, 14, 256) (None, 14, 14, 256) 6901223888->6901279712 6301445704 activation_605: Activation input: output: (None, 17, 17, 128) (None, 17, 17, 128) 
6301207464->6301445704 6304985040 activation_610: Activation input: output: (None, 17, 17, 128) (None, 17, 17, 128) 6304742368->6304985040 6721965584 block5_sepconv3_act: Activation input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6722284064->6721965584 6901341880 res4a_branch2b: Conv2D input: output: (None, 14, 14, 256) (None, 14, 14, 256) 6901279712->6901341880 6301608984 conv2d_426: Conv2D input: output: (None, 17, 17, 128) (None, 17, 17, 192) 6301445704->6301608984 6305148152 conv2d_431: Conv2D input: output: (None, 17, 17, 128) (None, 17, 17, 192) 6304985040->6305148152 6306074520 conv2d_432: Conv2D input: output: (None, 17, 17, 768) (None, 17, 17, 192) 6305888016->6306074520 6722421200 block5_sepconv3: SeparableConv2D input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6721965584->6722421200 6901603744 bn4a_branch2b: BatchNormalization input: output: (None, 14, 14, 256) (None, 14, 14, 256) 6901341880->6901603744 6296571856 batch_normalization_423: BatchNormalization input: output: (None, 17, 17, 192) (None, 17, 17, 192) 6296157208->6296571856 6301867480 batch_normalization_426: BatchNormalization input: output: (None, 17, 17, 192) (None, 17, 17, 192) 6301608984->6301867480 6305410744 batch_normalization_431: BatchNormalization input: output: (None, 17, 17, 192) (None, 17, 17, 192) 6305148152->6305410744 6306305920 batch_normalization_432: BatchNormalization input: output: (None, 17, 17, 192) (None, 17, 17, 192) 6306074520->6306305920 6723083232 block5_sepconv3_bn: BatchNormalization input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6722421200->6723083232 6902070968 activation_690: Activation input: output: (None, 14, 14, 256) (None, 14, 14, 256) 6901603744->6902070968 6296632680 activation_603: Activation input: output: (None, 17, 17, 192) (None, 17, 17, 192) 6296571856->6296632680 6302197744 activation_606: Activation input: output: (None, 17, 17, 192) (None, 17, 17, 192) 6301867480->6302197744 6305741176 activation_611: Activation input: 
output: (None, 17, 17, 192) (None, 17, 17, 192) 6305410744->6305741176 6306553136 activation_612: Activation input: output: (None, 17, 17, 192) (None, 17, 17, 192) 6306305920->6306553136 6723083232->6722735072 6902140880 res4a_branch2c: Conv2D input: output: (None, 14, 14, 256) (None, 14, 14, 1024) 6902070968->6902140880 6306553472 mixed4: Concatenate input: output: [(None, 17, 17, 192), (None, 17, 17, 192), (None, 17, 17, 192), (None, 17, 17, 192)] (None, 17, 17, 768) 6296632680->6306553472 6302197744->6306553472 6305741176->6306553472 6306553136->6306553472 6720394072 block6_sepconv1_act: Activation input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6722735072->6720394072 6752201416 add_117: Add input: output: [(None, 19, 19, 728), (None, 19, 19, 728)] (None, 19, 19, 728) 6722735072->6752201416 6902402352 bn4a_branch2c: BatchNormalization input: output: (None, 14, 14, 1024) (None, 14, 14, 1024) 6902140880->6902402352 6942021168 bn4a_branch1: BatchNormalization input: output: (None, 14, 14, 1024) (None, 14, 14, 1024) 6941835616->6942021168 6441409560 conv2d_437: Conv2D input: output: (None, 17, 17, 768) (None, 17, 17, 160) 6306553472->6441409560 6349200464 conv2d_434: Conv2D input: output: (None, 17, 17, 768) (None, 17, 17, 160) 6306553472->6349200464 6515527128 average_pooling2d_41: AveragePooling2D input: output: (None, 17, 17, 768) (None, 17, 17, 768) 6306553472->6515527128 6306553192 conv2d_433: Conv2D input: output: (None, 17, 17, 768) (None, 17, 17, 192) 6306553472->6306553192 6747071432 block6_sepconv1: SeparableConv2D input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6720394072->6747071432 6942558136 add_132: Add input: output: [(None, 14, 14, 1024), (None, 14, 14, 1024)] (None, 14, 14, 1024) 6902402352->6942558136 6942021168->6942558136 6441655880 batch_normalization_437: BatchNormalization input: output: (None, 17, 17, 160) (None, 17, 17, 160) 6441409560->6441655880 6748471248 block6_sepconv1_bn: BatchNormalization input: output: (None, 19, 19, 
728) (None, 19, 19, 728) 6747071432->6748471248 6942759512 activation_691: Activation input: output: (None, 14, 14, 1024) (None, 14, 14, 1024) 6942558136->6942759512 6441982776 activation_617: Activation input: output: (None, 17, 17, 160) (None, 17, 17, 160) 6441655880->6441982776 6748280984 block6_sepconv2_act: Activation input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6748471248->6748280984 6942817024 res4b_branch2a: Conv2D input: output: (None, 14, 14, 1024) (None, 14, 14, 256) 6942759512->6942817024 6944964504 add_133: Add input: output: [(None, 14, 14, 1024), (None, 14, 14, 1024)] (None, 14, 14, 1024) 6942759512->6944964504 6442132928 conv2d_438: Conv2D input: output: (None, 17, 17, 160) (None, 17, 17, 160) 6441982776->6442132928 6748415928 block6_sepconv2: SeparableConv2D input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6748280984->6748415928 6943288008 bn4b_branch2a: BatchNormalization input: output: (None, 14, 14, 256) (None, 14, 14, 256) 6942817024->6943288008 6442400456 batch_normalization_438: BatchNormalization input: output: (None, 17, 17, 160) (None, 17, 17, 160) 6442132928->6442400456 6751740424 block6_sepconv2_bn: BatchNormalization input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6748415928->6751740424 6943430136 activation_692: Activation input: output: (None, 14, 14, 256) (None, 14, 14, 256) 6943288008->6943430136 6442878568 activation_618: Activation input: output: (None, 17, 17, 160) (None, 17, 17, 160) 6442400456->6442878568 6751330200 block6_sepconv3_act: Activation input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6751740424->6751330200 6943493256 res4b_branch2b: Conv2D input: output: (None, 14, 14, 256) (None, 14, 14, 256) 6943430136->6943493256 6442815384 conv2d_439: Conv2D input: output: (None, 17, 17, 160) (None, 17, 17, 160) 6442878568->6442815384 6751813472 block6_sepconv3: SeparableConv2D input: output: (None, 19, 19, 728) (None, 19, 19, 728) 6751330200->6751813472 6943760224 bn4b_branch2b: BatchNormalization 
[Model architecture diagram (SVG output omitted): layer graphs of the pretrained ResNet50, InceptionV3, and Xception networks, each ending in a global average pooling layer for feature extraction.]
GlobalAveragePooling2D input: output: (None, 10, 10, 2048) (None, 2048) 6842305672->6842790968 7029533272 global_average_pooling2d_15: GlobalAveragePooling2D input: output: (None, 1, 1, 2048) (None, 2048) 7029388400->7029533272 7042720656 concatenate_15: Concatenate input: output: [(None, 2048), (None, 2048), (None, 2048)] (None, 6144) 6607668392->7042720656 6842790968->7042720656 7029533272->7042720656 7042720208 dropout_3: Dropout input: output: (None, 6144) (None, 6144) 7042720656->7042720208 7042721216 dense_3: Dense input: output: (None, 6144) (None, 1) 7042720208->7042721216
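The merged model ends in a simple fusion head: each backbone's features are global-average-pooled to a 2048-dim vector, the three vectors are concatenated into 6144 dims, and a Dropout plus single-unit Dense layer produce the prediction. A minimal NumPy sketch of that head at inference time (dropout is an identity at test time; the weights `w`, `b` here are random stand-ins, not the trained ones):

```python
import numpy as np

def fusion_head(feat_incep, feat_xcep, feat_res, w, b):
    """Inference pass of the fusion head: GAP -> concat -> dense(1) -> sigmoid.

    feat_*: (batch, h, w, 2048) feature maps from the three backbones.
    w: (6144, 1) dense weights, b: (1,) bias -- random stand-ins here.
    Dropout is skipped because it is an identity at inference time.
    """
    gap = [f.mean(axis=(1, 2)) for f in (feat_incep, feat_xcep, feat_res)]  # each (batch, 2048)
    x = np.concatenate(gap, axis=1)                                         # (batch, 6144)
    logits = x @ w + b                                                      # (batch, 1)
    return 1.0 / (1.0 + np.exp(-logits))                                    # sigmoid probability

rng = np.random.default_rng(0)
# Spatial sizes match the three backbones' final feature maps: 8x8, 10x10, 7x7
feats = [rng.standard_normal((2, s, s, 2048)) for s in (8, 10, 7)]
w, b = rng.standard_normal((6144, 1)) * 0.01, np.zeros(1)
p = fusion_head(*feats, w, b)
print(p.shape)  # (2, 1)
```

The Keras graph applies ResNet50's own 7×7 average pooling before the GlobalAveragePooling2D layer; averaging over the full 7×7 map, as above, gives the same vector.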

Load the trial file generated by HyperOpt, locate the path of the best model and load it. Finally, copy that model's fully-connected output-layer weights into the new model.

In [4]:
# Load the HyperOpt trials, sort them by validation loss (ascending),
# then load the best model and copy its output-layer weights.
with open("trial.pckl", "rb") as datafile:
    trial = pickle.load(datafile)

trial_sorted = sorted(trial.results, key=lambda t: t["eval"]["loss"])

path = trial_sorted[0]["path"]["model"]
best_model = load_model(path)
output_weights = best_model.layers[-1].get_weights()

model.layers[-1].set_weights(output_weights)
print("Done")
Done
In [5]:
total_count = 12500 * 2  # 12,500 cats + 12,500 dogs
In [6]:
x_res, y = load_train_data_original(224, total_count=total_count, split=False)  # 224x224 input for ResNet50
100%|██████████| 12500/12500 [01:27<00:00, 142.59it/s]
In [7]:
x_incXce, y = load_train_data_original(299, total_count=total_count, split=False)  # 299x299 input for InceptionV3/Xception
100%|██████████| 12500/12500 [01:06<00:00, 187.67it/s]
In [8]:
y_pred = model.predict([x_incXce, x_res], batch_size=128, verbose=1)
25000/25000 [==============================] - 1066s 43ms/step
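The competition is scored by binary log loss, so it is worth checking that metric on these predictions before submitting. A minimal NumPy sketch (the clipping bound `1e-15` follows common Kaggle practice to avoid `log(0)`; it is a convention, not part of this notebook's helper code):

```python
import numpy as np

def binary_log_loss(y_true, y_pred, eps=1e-15):
    """Mean binary cross-entropy, with predictions clipped away from 0 and 1."""
    p = np.clip(np.asarray(y_pred, dtype=float), eps, 1 - eps)
    y = np.asarray(y_true, dtype=float)
    return float(-np.mean(y * np.log(p) + (1 - y) * np.log(1 - p)))

# Confident correct predictions give a small loss; confident wrong ones are punished hard.
print(binary_log_loss([1, 0], [0.99, 0.01]))  # ~0.01
print(binary_log_loss([1, 0], [0.01, 0.99]))  # ~4.6
```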
In [9]:
show_most_loss_img(x_incXce, y, y_pred, img_num=100)
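`show_most_loss_img` comes from the project's `helper.py`, which is not shown here; judging by the call, it ranks images by per-sample loss and displays the worst `img_num` of them. A hypothetical stand-in covering the ranking step only (the real helper presumably also plots the selected images with matplotlib):

```python
import numpy as np

def most_loss_indices(y_true, y_pred, img_num=100, eps=1e-15):
    """Hypothetical sketch of the ranking inside show_most_loss_img:
    compute per-image binary cross-entropy and return the indices of the
    img_num highest-loss (most confidently wrong) images."""
    p = np.clip(np.ravel(y_pred).astype(float), eps, 1 - eps)
    y = np.ravel(y_true).astype(float)
    loss = -(y * np.log(p) + (1 - y) * np.log(1 - p))  # per-image loss
    return np.argsort(loss)[::-1][:img_num]            # highest loss first

idx = most_loss_indices([1, 1, 0, 0], [0.9, 0.2, 0.1, 0.8], img_num=2)
print(sorted(int(i) for i in idx))  # [1, 3]
```

Samples 1 and 3 are the ones the model gets most confidently wrong, so they surface first; inspecting such images often reveals mislabeled or ambiguous training data.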